[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
27885 1726882527.04126: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-spT
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
27885 1726882527.04462: Added group all to inventory
27885 1726882527.04464: Added group ungrouped to inventory
27885 1726882527.04467: Group all now contains ungrouped
27885 1726882527.04470: Examining possible inventory source: /tmp/network-Kc3/inventory.yml
27885 1726882527.15979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
27885 1726882527.16048: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
27885 1726882527.16075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
27885 1726882527.16122: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
27885 1726882527.16168: Loaded config def from plugin (inventory/script)
27885 1726882527.16170: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
27885 1726882527.16203: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
27885 1726882527.16257: Loaded config def from plugin (inventory/yaml)
27885 1726882527.16259: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
27885 1726882527.16327: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
27885 1726882527.16605: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
27885 1726882527.16608: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
27885 1726882527.16610: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
27885 1726882527.16614: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
27885 1726882527.16617: Loading data from /tmp/network-Kc3/inventory.yml
27885 1726882527.16659: /tmp/network-Kc3/inventory.yml was not parsable by auto
27885 1726882527.16704: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
27885 1726882527.16734: Loading data from /tmp/network-Kc3/inventory.yml
27885 1726882527.16784: group all already in inventory
27885 1726882527.16789: set inventory_file for managed_node1
27885 1726882527.16796: set inventory_dir for managed_node1
27885 1726882527.16797: Added host managed_node1 to inventory
27885 1726882527.16799: Added host managed_node1 to group all
27885 1726882527.16799: set ansible_host for managed_node1
27885 1726882527.16800:
set ansible_ssh_extra_args for managed_node1 27885 1726882527.16802: set inventory_file for managed_node2 27885 1726882527.16804: set inventory_dir for managed_node2 27885 1726882527.16804: Added host managed_node2 to inventory 27885 1726882527.16805: Added host managed_node2 to group all 27885 1726882527.16806: set ansible_host for managed_node2 27885 1726882527.16806: set ansible_ssh_extra_args for managed_node2 27885 1726882527.16808: set inventory_file for managed_node3 27885 1726882527.16809: set inventory_dir for managed_node3 27885 1726882527.16810: Added host managed_node3 to inventory 27885 1726882527.16810: Added host managed_node3 to group all 27885 1726882527.16811: set ansible_host for managed_node3 27885 1726882527.16811: set ansible_ssh_extra_args for managed_node3 27885 1726882527.16817: Reconcile groups and hosts in inventory. 27885 1726882527.16821: Group ungrouped now contains managed_node1 27885 1726882527.16823: Group ungrouped now contains managed_node2 27885 1726882527.16826: Group ungrouped now contains managed_node3 27885 1726882527.16925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 27885 1726882527.17050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 27885 1726882527.17108: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 27885 1726882527.17136: Loaded config def from plugin (vars/host_group_vars) 27885 1726882527.17139: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 27885 1726882527.17146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 27885 1726882527.17155: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 27885 1726882527.17204: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 27885 1726882527.17531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882527.17627: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 27885 1726882527.17665: Loaded config def from plugin (connection/local) 27885 1726882527.17669: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 27885 1726882527.18078: Loaded config def from plugin (connection/paramiko_ssh) 27885 1726882527.18081: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 27885 1726882527.18635: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 27885 1726882527.18657: Loaded config def from plugin (connection/psrp) 27885 1726882527.18659: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 27885 1726882527.19057: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 27885 1726882527.19080: Loaded config def from plugin (connection/ssh) 27885 1726882527.19081: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 27885 1726882527.20711: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 27885 1726882527.20749: Loaded config def from plugin (connection/winrm) 27885 1726882527.20752: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 27885 1726882527.20781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 27885 1726882527.20843: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 27885 1726882527.20901: Loaded config def from plugin (shell/cmd) 27885 1726882527.20903: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 27885 1726882527.20925: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 27885 1726882527.20991: Loaded config def from plugin (shell/powershell) 27885 1726882527.20995: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 27885 1726882527.21055: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 27885 1726882527.21240: Loaded config def from plugin (shell/sh) 27885 1726882527.21242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 27885 1726882527.21275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 27885 1726882527.21405: Loaded config def from plugin (become/runas) 27885 1726882527.21407: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 27885 1726882527.21596: Loaded config def from plugin (become/su) 27885 1726882527.21598: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 27885 1726882527.21756: Loaded config def from plugin (become/sudo) 27885 1726882527.21758: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 27885 1726882527.21794: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml 27885 1726882527.22127: in VariableManager get_vars() 27885 1726882527.22149: done with get_vars() 27885 1726882527.22274: trying /usr/local/lib/python3.12/site-packages/ansible/modules 27885 1726882527.24376: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 27885 1726882527.24489: in VariableManager get_vars() 
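Everything up to this point is ansible-playbook bootstrapping itself: it warns about the plural ANSIBLE_COLLECTIONS_PATHS variable, finds no ansible.cfg, parses /tmp/network-Kc3/inventory.yml with the yaml inventory plugin (host_list, script and auto decline it first), and pre-loads the cache, vars, connection, shell and become plugins. As a rough sketch, a run like this could be launched with the singular variable name the deprecation warning asks for; the verbosity flag and any other CI options are assumptions, only the two paths come from the log:

    # Point ansible-core at the collection tree from the header and drop the
    # deprecated plural spelling, then run the test playbook on the same inventory.
    export ANSIBLE_COLLECTIONS_PATH=/tmp/collections-spT
    unset ANSIBLE_COLLECTIONS_PATHS
    ansible-playbook -vvv \
        -i /tmp/network-Kc3/inventory.yml \
        /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml

The long run of 'Loading ... from /usr/local/lib/python3.12/site-packages/ansible/plugins/...' lines is the plugin loader touching each builtin plugin once; the same sets can be listed outside a run with ansible-doc:

    # List the plugin types the loader walks above (names match the log).
    ansible-doc -t connection -l   # local, paramiko_ssh, psrp, ssh, winrm, ...
    ansible-doc -t shell -l        # cmd, powershell, sh, ...
    ansible-doc -t become -l       # runas, su, sudo, ...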
27885 1726882527.24498: done with get_vars() 27885 1726882527.24501: variable 'playbook_dir' from source: magic vars 27885 1726882527.24502: variable 'ansible_playbook_python' from source: magic vars 27885 1726882527.24503: variable 'ansible_config_file' from source: magic vars 27885 1726882527.24503: variable 'groups' from source: magic vars 27885 1726882527.24504: variable 'omit' from source: magic vars 27885 1726882527.24505: variable 'ansible_version' from source: magic vars 27885 1726882527.24505: variable 'ansible_check_mode' from source: magic vars 27885 1726882527.24506: variable 'ansible_diff_mode' from source: magic vars 27885 1726882527.24507: variable 'ansible_forks' from source: magic vars 27885 1726882527.24508: variable 'ansible_inventory_sources' from source: magic vars 27885 1726882527.24508: variable 'ansible_skip_tags' from source: magic vars 27885 1726882527.24509: variable 'ansible_limit' from source: magic vars 27885 1726882527.24510: variable 'ansible_run_tags' from source: magic vars 27885 1726882527.24510: variable 'ansible_verbosity' from source: magic vars 27885 1726882527.24543: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml 27885 1726882527.25328: in VariableManager get_vars() 27885 1726882527.25338: done with get_vars() 27885 1726882527.25366: in VariableManager get_vars() 27885 1726882527.25374: done with get_vars() 27885 1726882527.25402: in VariableManager get_vars() 27885 1726882527.25410: done with get_vars() 27885 1726882527.25475: in VariableManager get_vars() 27885 1726882527.25484: done with get_vars() 27885 1726882527.25512: in VariableManager get_vars() 27885 1726882527.25521: done with get_vars() 27885 1726882527.25550: in VariableManager get_vars() 27885 1726882527.25557: done with get_vars() 27885 1726882527.25594: in VariableManager get_vars() 27885 1726882527.25603: done with get_vars() 27885 1726882527.25607: variable 'omit' from source: magic vars 27885 1726882527.25620: variable 'omit' from source: magic vars 27885 1726882527.25639: in VariableManager get_vars() 27885 1726882527.25645: done with get_vars() 27885 1726882527.25673: in VariableManager get_vars() 27885 1726882527.25681: done with get_vars() 27885 1726882527.25708: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 27885 1726882527.25835: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 27885 1726882527.25912: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 27885 1726882527.26278: in VariableManager get_vars() 27885 1726882527.26292: done with get_vars() 27885 1726882527.26570: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 27885 1726882527.26657: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 27885 1726882527.28555: in VariableManager get_vars() 27885 1726882527.28575: done with get_vars() 27885 1726882527.28580: variable 'omit' from source: magic vars 27885 1726882527.28596: variable 'omit' from source: magic vars 27885 1726882527.28631: in VariableManager get_vars() 27885 1726882527.28646: done with get_vars() 27885 1726882527.28666: in VariableManager get_vars() 27885 1726882527.28681: done with get_vars() 27885 1726882527.28715: Loading data from 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 27885 1726882527.28823: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 27885 1726882527.28888: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 27885 1726882527.31035: in VariableManager get_vars() 27885 1726882527.31059: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 27885 1726882527.33175: in VariableManager get_vars() 27885 1726882527.33200: done with get_vars() 27885 1726882527.33238: in VariableManager get_vars() 27885 1726882527.33255: done with get_vars() 27885 1726882527.33384: in VariableManager get_vars() 27885 1726882527.33408: done with get_vars() 27885 1726882527.33445: in VariableManager get_vars() 27885 1726882527.33462: done with get_vars() 27885 1726882527.33503: in VariableManager get_vars() 27885 1726882527.33522: done with get_vars() 27885 1726882527.33559: in VariableManager get_vars() 27885 1726882527.33577: done with get_vars() 27885 1726882527.33641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 27885 1726882527.33655: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 27885 1726882527.33895: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 27885 1726882527.34061: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 27885 1726882527.34064: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 27885 1726882527.34100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 27885 1726882527.34125: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 27885 1726882527.34299: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 27885 1726882527.34359: Loaded config def from plugin (callback/default) 27885 1726882527.34362: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 27885 1726882527.35492: Loaded config def from plugin (callback/junit) 27885 1726882527.35497: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 27885 1726882527.35539: Loading ModuleDocFragment 'result_format_callback' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 27885 1726882527.35605: Loaded config def from plugin (callback/minimal) 27885 1726882527.35608: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 27885 1726882527.35647: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 27885 1726882527.35708: Loaded config def from plugin (callback/tree) 27885 1726882527.35711: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 27885 1726882527.35837: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 27885 1726882527.35839: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
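Two callback redirects stand out in this block: ansible.builtin.debug and ansible.builtin.profile_tasks are routed to their ansible.posix counterparts, the posix debug callback takes the stdout slot (hence default, minimal and oneline are skipped), and profile_tasks supplies the per-task timing line seen below. The configuration that produces this is not in the log itself; a plausible guess, expressed as environment variables, would be:

    # Assumed callback setup for this run (not logged anywhere above):
    # 'debug' as the stdout callback resolves to ansible.posix.debug, and
    # enabling 'profile_tasks' resolves to ansible.posix.profile_tasks,
    # which prints the timestamp/duration banner after each task header.
    export ANSIBLE_STDOUT_CALLBACK=debug
    export ANSIBLE_CALLBACKS_ENABLED=profile_tasks

The 'redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf' lines are unrelated to callbacks: ansible-core 2.17 routes the legacy yum action to dnf, so a task that still names yum (here, while the role and test tasks are parsed) is resolved to the dnf action.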
PLAYBOOK: tests_route_device_nm.yml ******************************************** 2 plays in /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml 27885 1726882527.35865: in VariableManager get_vars() 27885 1726882527.35876: done with get_vars() 27885 1726882527.35882: in VariableManager get_vars() 27885 1726882527.35895: done with get_vars() 27885 1726882527.35899: variable 'omit' from source: magic vars 27885 1726882527.35935: in VariableManager get_vars() 27885 1726882527.35948: done with get_vars() 27885 1726882527.35968: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_route_device.yml' with nm as provider] ***** 27885 1726882527.36508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 27885 1726882527.36579: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 27885 1726882527.36615: getting the remaining hosts for this loop 27885 1726882527.36617: done getting the remaining hosts for this loop 27885 1726882527.36619: getting the next task for host managed_node2 27885 1726882527.36623: done getting next task for host managed_node2 27885 1726882527.36624: ^ task is: TASK: Gathering Facts 27885 1726882527.36626: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882527.36628: getting variables 27885 1726882527.36629: in VariableManager get_vars() 27885 1726882527.36638: Calling all_inventory to load vars for managed_node2 27885 1726882527.36641: Calling groups_inventory to load vars for managed_node2 27885 1726882527.36643: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882527.36654: Calling all_plugins_play to load vars for managed_node2 27885 1726882527.36666: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882527.36669: Calling groups_plugins_play to load vars for managed_node2 27885 1726882527.36706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882527.36760: done with get_vars() 27885 1726882527.36766: done getting variables 27885 1726882527.36833: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:6 Friday 20 September 2024 21:35:27 -0400 (0:00:00.011) 0:00:00.011 ****** 27885 1726882527.36854: entering _queue_task() for managed_node2/gather_facts 27885 1726882527.36855: Creating lock for gather_facts 27885 1726882527.37226: worker is 1 (out of 1 available) 27885 1726882527.37237: exiting _queue_task() for managed_node2/gather_facts 27885 1726882527.37262: done queuing things up, now waiting for results queue to drain 27885 1726882527.37263: waiting for pending results... 
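The first task of the play is the implicit fact gathering, queued here for managed_node2 and executed through the ssh connection plugin with the sh shell (the 'Set connection var ...' lines that follow). The same step can be reproduced ad hoc with the setup module, which is what gather_facts wraps and what is shipped to the host further down as AnsiballZ_setup.py; the command below is an illustration, not something run in this log:

    # Ad-hoc equivalent of TASK [Gathering Facts] against the same inventory host.
    ansible -i /tmp/network-Kc3/inventory.yml managed_node2 -m ansible.builtin.setup

Everything that follows is the low-level transport for that one task: reuse the existing SSH ControlMaster socket, create a remote temp directory under ~/.ansible/tmp, discover a Python interpreter on the target, upload the AnsiballZ payload over SFTP, chmod it, and execute it with the interpreter that was found (/usr/bin/python3.12 on this CentOS Stream 10 host).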
27885 1726882527.37552: running TaskExecutor() for managed_node2/TASK: Gathering Facts 27885 1726882527.37557: in run() - task 12673a56-9f93-3fa5-01be-0000000000bf 27885 1726882527.37560: variable 'ansible_search_path' from source: unknown 27885 1726882527.37591: calling self._execute() 27885 1726882527.37663: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882527.37675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882527.37686: variable 'omit' from source: magic vars 27885 1726882527.37792: variable 'omit' from source: magic vars 27885 1726882527.37827: variable 'omit' from source: magic vars 27885 1726882527.37874: variable 'omit' from source: magic vars 27885 1726882527.37922: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882527.37968: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882527.37996: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882527.38020: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882527.38037: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882527.38070: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882527.38086: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882527.38190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882527.38208: Set connection var ansible_pipelining to False 27885 1726882527.38219: Set connection var ansible_connection to ssh 27885 1726882527.38229: Set connection var ansible_timeout to 10 27885 1726882527.38235: Set connection var ansible_shell_type to sh 27885 1726882527.38243: Set connection var ansible_shell_executable to /bin/sh 27885 1726882527.38251: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882527.38276: variable 'ansible_shell_executable' from source: unknown 27885 1726882527.38283: variable 'ansible_connection' from source: unknown 27885 1726882527.38296: variable 'ansible_module_compression' from source: unknown 27885 1726882527.38306: variable 'ansible_shell_type' from source: unknown 27885 1726882527.38313: variable 'ansible_shell_executable' from source: unknown 27885 1726882527.38319: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882527.38327: variable 'ansible_pipelining' from source: unknown 27885 1726882527.38334: variable 'ansible_timeout' from source: unknown 27885 1726882527.38342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882527.38530: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882527.38546: variable 'omit' from source: magic vars 27885 1726882527.38556: starting attempt loop 27885 1726882527.38563: running the handler 27885 1726882527.38584: variable 'ansible_facts' from source: unknown 27885 1726882527.38610: _low_level_execute_command(): starting 27885 1726882527.38700: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882527.39370: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882527.39401: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882527.39417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882527.39491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882527.39509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882527.39553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882527.39572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882527.39609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882527.39720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882527.41444: stdout chunk (state=3): >>>/root <<< 27885 1726882527.41589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882527.41631: stdout chunk (state=3): >>><<< 27885 1726882527.41634: stderr chunk (state=3): >>><<< 27885 1726882527.41656: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882527.41764: _low_level_execute_command(): starting 27885 1726882527.41768: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882527.4166353-27914-246816898720461 `" && echo 
ansible-tmp-1726882527.4166353-27914-246816898720461="` echo /root/.ansible/tmp/ansible-tmp-1726882527.4166353-27914-246816898720461 `" ) && sleep 0' 27885 1726882527.42332: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882527.42347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882527.42410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882527.42476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882527.42502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882527.42516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882527.42606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882527.44476: stdout chunk (state=3): >>>ansible-tmp-1726882527.4166353-27914-246816898720461=/root/.ansible/tmp/ansible-tmp-1726882527.4166353-27914-246816898720461 <<< 27885 1726882527.44610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882527.44619: stdout chunk (state=3): >>><<< 27885 1726882527.44626: stderr chunk (state=3): >>><<< 27885 1726882527.44721: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882527.4166353-27914-246816898720461=/root/.ansible/tmp/ansible-tmp-1726882527.4166353-27914-246816898720461 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882527.44725: variable 'ansible_module_compression' from source: unknown 27885 1726882527.44727: ANSIBALLZ: Using generic lock 
for ansible.legacy.setup 27885 1726882527.44730: ANSIBALLZ: Acquiring lock 27885 1726882527.44732: ANSIBALLZ: Lock acquired: 140560087758944 27885 1726882527.44734: ANSIBALLZ: Creating module 27885 1726882527.71290: ANSIBALLZ: Writing module into payload 27885 1726882527.71499: ANSIBALLZ: Writing module 27885 1726882527.71502: ANSIBALLZ: Renaming module 27885 1726882527.71504: ANSIBALLZ: Done creating module 27885 1726882527.71506: variable 'ansible_facts' from source: unknown 27885 1726882527.71511: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882527.71524: _low_level_execute_command(): starting 27885 1726882527.71534: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 27885 1726882527.72129: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882527.72143: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882527.72157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882527.72173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882527.72188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882527.72285: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882527.72299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882527.72398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882527.74015: stdout chunk (state=3): >>>PLATFORM <<< 27885 1726882527.74086: stdout chunk (state=3): >>>Linux <<< 27885 1726882527.74115: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 27885 1726882527.74311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882527.74314: stdout chunk (state=3): >>><<< 27885 1726882527.74317: stderr chunk (state=3): >>><<< 27885 1726882527.74333: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882527.74349 [managed_node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 27885 1726882527.74421: _low_level_execute_command(): starting 27885 1726882527.74424: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 27885 1726882527.74726: Sending initial data 27885 1726882527.74729: Sent initial data (1181 bytes) 27885 1726882527.75038: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882527.75063: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882527.75078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882527.75100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882527.75119: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882527.75132: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882527.75170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882527.75190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882527.75281: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882527.75296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882527.75315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882527.75340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882527.75419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882527.78798: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 27885 1726882527.79207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882527.79218: stdout chunk (state=3): >>><<< 27885 1726882527.79229: stderr chunk (state=3): >>><<< 27885 1726882527.79249: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882527.79406: variable 'ansible_facts' from source: unknown 27885 1726882527.79409: variable 'ansible_facts' from source: unknown 27885 1726882527.79411: variable 'ansible_module_compression' from source: unknown 27885 1726882527.79413: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 27885 1726882527.79439: variable 'ansible_facts' from source: unknown 27885 1726882527.79643: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882527.4166353-27914-246816898720461/AnsiballZ_setup.py 27885 1726882527.79863: Sending initial data 27885 1726882527.79866: Sent initial data (154 bytes) 27885 1726882527.80547: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882527.80605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882527.80617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882527.80663: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882527.80719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882527.82275: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882527.82339: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27885 1726882527.82411: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpmbbg1_dm /root/.ansible/tmp/ansible-tmp-1726882527.4166353-27914-246816898720461/AnsiballZ_setup.py <<< 27885 1726882527.82427: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882527.4166353-27914-246816898720461/AnsiballZ_setup.py" <<< 27885 1726882527.82485: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpmbbg1_dm" to remote "/root/.ansible/tmp/ansible-tmp-1726882527.4166353-27914-246816898720461/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882527.4166353-27914-246816898720461/AnsiballZ_setup.py" <<< 27885 1726882527.84423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882527.84427: stdout chunk (state=3): >>><<< 27885 1726882527.84429: stderr chunk (state=3): >>><<< 27885 1726882527.84432: done transferring module to remote 27885 1726882527.84434: _low_level_execute_command(): starting 27885 1726882527.84436: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882527.4166353-27914-246816898720461/ /root/.ansible/tmp/ansible-tmp-1726882527.4166353-27914-246816898720461/AnsiballZ_setup.py && sleep 0' 27885 1726882527.85044: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882527.85058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882527.85073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882527.85098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882527.85114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882527.85149: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882527.85198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882527.85212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882527.85262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882527.85281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882527.85315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882527.85402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882527.87198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882527.87202: stdout chunk (state=3): >>><<< 27885 1726882527.87204: stderr chunk (state=3): >>><<< 27885 1726882527.87308: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882527.87311: _low_level_execute_command(): starting 27885 1726882527.87322: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882527.4166353-27914-246816898720461/AnsiballZ_setup.py && sleep 0' 27885 1726882527.87988: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882527.88005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882527.88022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882527.88119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882527.90237: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 27885 1726882527.90248: stdout chunk (state=3): >>>import _imp # builtin <<< 27885 1726882527.90282: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 27885 1726882527.90348: stdout chunk (state=3): >>>import '_io' # <<< 27885 1726882527.90353: stdout chunk (state=3): >>>import 'marshal' # <<< 27885 1726882527.90381: stdout chunk (state=3): >>>import 'posix' # <<< 27885 1726882527.90421: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 27885 1726882527.90446: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 27885 1726882527.90449: stdout chunk (state=3): >>># installed zipimport hook <<< 27885 1726882527.90500: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 27885 1726882527.90508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882527.90517: stdout chunk (state=3): >>>import '_codecs' # <<< 27885 1726882527.90540: stdout chunk (state=3): >>>import 'codecs' # <<< 27885 1726882527.90572: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 27885 1726882527.90606: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 27885 1726882527.90609: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1787bc4d0> <<< 27885 1726882527.90611: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17878bb00> <<< 27885 1726882527.90635: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 27885 1726882527.90638: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 27885 1726882527.90650: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1787bea50> <<< 27885 1726882527.90669: stdout chunk (state=3): >>>import '_signal' # <<< 27885 1726882527.90691: stdout chunk (state=3): >>>import '_abc' # <<< 27885 1726882527.90701: stdout chunk (state=3): >>>import 'abc' # <<< 27885 1726882527.90717: stdout chunk (state=3): >>>import 'io' # <<< 27885 1726882527.90752: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 27885 1726882527.90838: stdout chunk (state=3): >>>import '_collections_abc' # <<< 27885 1726882527.90862: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 27885 1726882527.90886: stdout chunk (state=3): >>>import 'os' # <<< 27885 1726882527.90912: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 27885 1726882527.90919: stdout chunk (state=3): >>>Processing user site-packages <<< 27885 1726882527.90943: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 27885 1726882527.90957: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 27885 1726882527.90978: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 27885 1726882527.90984: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 27885 1726882527.91006: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1787cd130> <<< 27885 1726882527.91066: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 27885 1726882527.91069: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882527.91083: stdout chunk (state=3): 
>>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1787cdfa0> <<< 27885 1726882527.91102: stdout chunk (state=3): >>>import 'site' # <<< 27885 1726882527.91132: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 27885 1726882527.91496: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 27885 1726882527.91503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 27885 1726882527.91525: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 27885 1726882527.91530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882527.91552: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 27885 1726882527.91598: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 27885 1726882527.91611: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 27885 1726882527.91638: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 27885 1726882527.91641: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1785ebda0> <<< 27885 1726882527.91667: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 27885 1726882527.91681: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 27885 1726882527.91706: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1785ebfe0> <<< 27885 1726882527.91731: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 27885 1726882527.91752: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 27885 1726882527.91779: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 27885 1726882527.91825: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882527.91843: stdout chunk (state=3): >>>import 'itertools' # <<< 27885 1726882527.91865: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 27885 1726882527.91874: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1786237a0> <<< 27885 1726882527.91886: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 27885 1726882527.91904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 27885 1726882527.91913: stdout 
chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178623e30> <<< 27885 1726882527.91937: stdout chunk (state=3): >>>import '_collections' # <<< 27885 1726882527.91968: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178603a70> <<< 27885 1726882527.91981: stdout chunk (state=3): >>>import '_functools' # <<< 27885 1726882527.92025: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178601190> <<< 27885 1726882527.92177: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1785e8f50> <<< 27885 1726882527.92213: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 27885 1726882527.92216: stdout chunk (state=3): >>>import '_sre' # <<< 27885 1726882527.92219: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 27885 1726882527.92221: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 27885 1726882527.92247: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 27885 1726882527.92282: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178643710> <<< 27885 1726882527.92318: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178642330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 27885 1726882527.92344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178602060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1785ea810> <<< 27885 1726882527.92377: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1786787a0> <<< 27885 1726882527.92433: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1785e81d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 27885 1726882527.92436: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882527.92476: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff178678c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178678b00> # extension module 'binascii' loaded from 
'/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882527.92488: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff178678ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1785e6cf0> <<< 27885 1726882527.92516: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882527.92534: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 27885 1726882527.92601: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1786795b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178679280> <<< 27885 1726882527.92604: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 27885 1726882527.92643: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 27885 1726882527.92674: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17867a4b0> <<< 27885 1726882527.92677: stdout chunk (state=3): >>>import 'importlib.util' # <<< 27885 1726882527.92704: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 27885 1726882527.92715: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 27885 1726882527.92743: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 27885 1726882527.92785: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1786906e0> import 'errno' # <<< 27885 1726882527.92810: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff178691df0> <<< 27885 1726882527.92850: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 27885 1726882527.92908: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 27885 1726882527.92938: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178692c60> # extension module '_bz2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1786932c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1786921b0> <<< 27885 1726882527.92969: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 27885 1726882527.93014: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff178693d40> <<< 27885 1726882527.93036: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178693470> <<< 27885 1726882527.93054: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17867a510> <<< 27885 1726882527.93071: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 27885 1726882527.93089: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 27885 1726882527.93101: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 27885 1726882527.93123: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 27885 1726882527.93156: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882527.93162: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff178393b90> <<< 27885 1726882527.93183: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 27885 1726882527.93192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 27885 1726882527.93211: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882527.93222: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1783bc620> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1783bc3b0> <<< 27885 1726882527.93240: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1783bc650> <<< 27885 1726882527.93271: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 27885 1726882527.93278: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 27885 1726882527.93349: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882527.93464: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1783bcf80> <<< 27885 1726882527.93576: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882527.93589: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1783bd970> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1783bc830> <<< 27885 1726882527.93597: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178391d30> <<< 27885 1726882527.93618: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 27885 1726882527.93651: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 27885 1726882527.93675: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 27885 1726882527.93678: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1783bed20> <<< 27885 1726882527.93710: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1783bda90> <<< 27885 1726882527.93723: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17867ac00> <<< 27885 1726882527.93755: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 27885 1726882527.93809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882527.93831: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 27885 1726882527.93859: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 27885 1726882527.93892: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1783eb080> <<< 27885 1726882527.93944: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 27885 1726882527.93963: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882527.93977: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches 
/usr/lib64/python3.12/contextlib.py <<< 27885 1726882527.94000: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 27885 1726882527.94036: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17840b410> <<< 27885 1726882527.94062: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 27885 1726882527.94106: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 27885 1726882527.94160: stdout chunk (state=3): >>>import 'ntpath' # <<< 27885 1726882527.94177: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882527.94189: stdout chunk (state=3): >>>import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17846c1d0> <<< 27885 1726882527.94201: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 27885 1726882527.94231: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 27885 1726882527.94253: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 27885 1726882527.94298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 27885 1726882527.94377: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17846e930> <<< 27885 1726882527.94450: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17846c2f0> <<< 27885 1726882527.94479: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1784391f0> <<< 27885 1726882527.94514: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177d292e0> <<< 27885 1726882527.94533: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17840a240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1783bfc50> <<< 27885 1726882527.94712: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 27885 1726882527.94739: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff17840a330> <<< 27885 1726882527.94997: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_jk0z8lda/ansible_ansible.legacy.setup_payload.zip' <<< 27885 1726882527.95019: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882527.95123: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882527.95146: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from 
'/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 27885 1726882527.95198: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 27885 1726882527.95295: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 27885 1726882527.95320: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177d8afc0> import '_typing' # <<< 27885 1726882527.95485: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177d69eb0> <<< 27885 1726882527.95535: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177d69010> # zipimport: zlib available import 'ansible' # # zipimport: zlib available <<< 27885 1726882527.95586: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 27885 1726882527.95590: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882527.96953: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882527.98081: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 27885 1726882527.98084: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177d88e90> <<< 27885 1726882527.98123: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 27885 1726882527.98146: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 27885 1726882527.98165: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 27885 1726882527.98186: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882527.98210: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177dbe9f0> <<< 27885 1726882527.98223: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177dbe780> <<< 27885 1726882527.98257: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177dbe090> <<< 27885 1726882527.98278: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 27885 1726882527.98319: stdout 
chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177dbe4e0> <<< 27885 1726882527.98325: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177d8bc50> import 'atexit' # <<< 27885 1726882527.98352: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882527.98358: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177dbf770> <<< 27885 1726882527.98383: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177dbf9b0> <<< 27885 1726882527.98403: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 27885 1726882527.98442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 27885 1726882527.98458: stdout chunk (state=3): >>>import '_locale' # <<< 27885 1726882527.98524: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177dbfef0> import 'pwd' # <<< 27885 1726882527.98543: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 27885 1726882527.98559: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 27885 1726882527.98586: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c25d00> <<< 27885 1726882527.98641: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177c27920> <<< 27885 1726882527.98645: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 27885 1726882527.98654: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 27885 1726882527.98698: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c282f0> <<< 27885 1726882527.98738: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 27885 1726882527.98742: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 27885 1726882527.98779: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c29490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 27885 1726882527.98813: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 27885 1726882527.98837: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 27885 1726882527.98878: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c2bf80> <<< 27885 1726882527.98923: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177c302c0> <<< 27885 1726882527.98942: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c2a240> <<< 27885 1726882527.98971: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 27885 1726882527.99007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 27885 1726882527.99010: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 27885 1726882527.99031: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 27885 1726882527.99118: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 27885 1726882527.99165: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c33fb0> <<< 27885 1726882527.99171: stdout chunk (state=3): >>>import '_tokenize' # <<< 27885 1726882527.99235: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c32a80> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c327e0> <<< 27885 1726882527.99262: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 27885 1726882527.99352: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c32d50> <<< 27885 1726882527.99373: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c2a750> <<< 27885 1726882527.99429: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177c78200> <<< 27885 1726882527.99450: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c783b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 27885 1726882527.99484: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 27885 1726882527.99488: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 27885 1726882527.99516: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 27885 1726882527.99523: stdout chunk (state=3): >>> import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177c79e50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c79be0> <<< 27885 1726882527.99534: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 27885 1726882527.99566: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 27885 1726882527.99614: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882527.99618: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177c7c350> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c7a480> <<< 27885 1726882527.99639: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 27885 1726882527.99672: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882527.99703: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 27885 1726882527.99712: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 27885 1726882527.99723: stdout chunk (state=3): >>>import '_string' # <<< 27885 1726882527.99757: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c7fad0> <<< 27885 1726882527.99877: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c7c4a0> <<< 27885 1726882527.99936: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177c80e00> <<< 27885 1726882527.99967: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177c7d010> <<< 27885 1726882528.00007: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177c80c50> <<< 27885 1726882528.00023: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c78530> <<< 27885 1726882528.00043: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 27885 1726882528.00063: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 27885 1726882528.00081: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 27885 1726882528.00113: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882528.00138: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882528.00143: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177b0c290> <<< 27885 1726882528.00284: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882528.00292: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177b0d2e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c82a20> <<< 27885 1726882528.00326: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 27885 1726882528.00338: stdout chunk (state=3): >>> import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177c83dd0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c82630> <<< 27885 1726882528.00349: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.00366: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 27885 1726882528.00380: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.00468: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.00551: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.00570: 
stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 27885 1726882528.00597: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.00608: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 27885 1726882528.00615: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.00737: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.00852: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.01389: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.01911: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 27885 1726882528.01922: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 27885 1726882528.01932: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 27885 1726882528.01943: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 27885 1726882528.01965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882528.02016: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882528.02020: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177b11580> <<< 27885 1726882528.02102: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 27885 1726882528.02106: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 27885 1726882528.02124: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177b12330> <<< 27885 1726882528.02131: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177b0d4f0> <<< 27885 1726882528.02170: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 27885 1726882528.02185: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.02210: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.02218: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 27885 1726882528.02231: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.02376: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.02533: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 27885 1726882528.02542: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177b12120> <<< 27885 1726882528.02558: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.03007: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.03447: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.03519: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 27885 1726882528.03594: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 27885 1726882528.03600: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.03641: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.03673: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 27885 1726882528.03685: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.03751: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.03833: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 27885 1726882528.03836: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.03858: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 27885 1726882528.03878: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.03917: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.03956: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 27885 1726882528.03970: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.04196: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.04423: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 27885 1726882528.04479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 27885 1726882528.04484: stdout chunk (state=3): >>>import '_ast' # <<< 27885 1726882528.04554: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177b13560> <<< 27885 1726882528.04560: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.04637: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.04721: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 27885 1726882528.04731: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 27885 1726882528.04740: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.04788: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.04821: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 27885 1726882528.04839: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.04875: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.04923: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.04978: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.05047: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 27885 1726882528.05078: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882528.05155: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177b1e150> <<< 27885 1726882528.05192: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177b1bec0> <<< 27885 1726882528.05223: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 27885 1726882528.05228: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 27885 1726882528.05301: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.05359: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.05388: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.05432: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 27885 1726882528.05438: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882528.05451: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 27885 1726882528.05474: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 27885 1726882528.05491: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 27885 1726882528.05553: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 27885 1726882528.05567: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 27885 1726882528.05583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 27885 1726882528.05636: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c06b10> <<< 27885 1726882528.05678: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177cfe7e0> <<< 27885 1726882528.05755: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177b1e330> <<< 27885 1726882528.05761: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177b0d490> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 27885 1726882528.05797: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.05822: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 27885 1726882528.05826: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 27885 1726882528.05881: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 27885 1726882528.05896: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.05900: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.05918: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available <<< 27885 1726882528.05980: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.06039: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.06060: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.06076: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.06124: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.06164: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.06203: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.06230: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 27885 1726882528.06249: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.06316: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.06388: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.06406: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.06443: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 27885 1726882528.06448: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.06623: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.06791: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.06835: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.06890: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 27885 1726882528.06903: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882528.06915: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 27885 1726882528.06929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 27885 1726882528.06945: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 27885 1726882528.06970: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 27885 1726882528.06987: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177bb2150> <<< 27885 1726882528.07012: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 27885 1726882528.07022: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 27885 1726882528.07043: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 27885 1726882528.07077: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 27885 1726882528.07106: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 27885 1726882528.07109: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 27885 1726882528.07129: stdout chunk (state=3): >>>import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff1777bc110> <<< 27885 1726882528.07155: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882528.07172: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1777bc6b0> <<< 27885 1726882528.07229: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177b98050> <<< 27885 1726882528.07238: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177bb2cc0> <<< 27885 1726882528.07270: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177bb0830> <<< 27885 1726882528.07275: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177bb0470> <<< 27885 1726882528.07299: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 27885 1726882528.07335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 27885 1726882528.07361: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 27885 1726882528.07366: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 27885 1726882528.07394: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 27885 1726882528.07397: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 27885 1726882528.07430: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882528.07438: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1777bf380> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1777bec30> <<< 27885 1726882528.07461: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882528.07472: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1777bee10> <<< 27885 1726882528.07483: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1777be060><<< 27885 1726882528.07495: stdout chunk (state=3): >>> <<< 27885 1726882528.07503: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 27885 1726882528.07600: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 27885 1726882528.07609: stdout chunk 
(state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1777bf530> <<< 27885 1726882528.07627: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 27885 1726882528.07657: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 27885 1726882528.07685: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177822060> <<< 27885 1726882528.07718: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1777bff80> <<< 27885 1726882528.07745: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177bb0500> <<< 27885 1726882528.07748: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 27885 1726882528.07773: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 27885 1726882528.07778: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.07798: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 27885 1726882528.07806: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.07869: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.07922: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 27885 1726882528.07945: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.07987: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.08038: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 27885 1726882528.08054: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.08063: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 27885 1726882528.08083: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.08115: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.08146: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 27885 1726882528.08154: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.08210: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.08251: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 27885 1726882528.08254: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.08310: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.08342: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 27885 1726882528.08357: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.08413: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.08471: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.08529: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 
1726882528.08583: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 27885 1726882528.08597: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 27885 1726882528.08604: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.09060: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.09480: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 27885 1726882528.09486: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.09541: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.09595: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.09629: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.09663: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 27885 1726882528.09675: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 27885 1726882528.09677: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.09711: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.09734: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 27885 1726882528.09747: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.09806: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.09858: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 27885 1726882528.09869: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.09904: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.09934: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 27885 1726882528.09937: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.09966: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.10002: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 27885 1726882528.10006: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.10087: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.10175: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 27885 1726882528.10183: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 27885 1726882528.10198: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177823d40> <<< 27885 1726882528.10221: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 27885 1726882528.10245: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 27885 1726882528.10366: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177822e10> import 'ansible.module_utils.facts.system.local' # <<< 27885 1726882528.10374: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.10440: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.10505: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 27885 1726882528.10510: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 27885 1726882528.10604: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.10694: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 27885 1726882528.10702: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.10762: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.10835: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 27885 1726882528.10839: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.10883: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.10933: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 27885 1726882528.10976: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 27885 1726882528.11042: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882528.11099: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff17785e420> <<< 27885 1726882528.11274: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17784f080> <<< 27885 1726882528.11280: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 27885 1726882528.11347: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.11398: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 27885 1726882528.11413: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.11488: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.11570: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.11678: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.11826: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 27885 1726882528.11829: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.11870: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.11913: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 27885 1726882528.11919: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.11964: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.12012: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 27885 1726882528.12018: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 27885 1726882528.12038: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882528.12066: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177872000> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff177871f70> <<< 27885 1726882528.12084: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # <<< 27885 1726882528.12088: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.12096: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.12111: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # <<< 27885 1726882528.12117: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.12152: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.12196: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 27885 1726882528.12201: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.12356: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.12506: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 27885 1726882528.12511: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.12610: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.12712: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.12751: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.12792: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 27885 1726882528.12809: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 27885 1726882528.12812: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.12829: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.12860: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.12992: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.13137: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 27885 1726882528.13146: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.13264: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.13383: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 27885 1726882528.13390: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.13429: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.13459: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.13989: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.14486: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 27885 1726882528.14495: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 27885 1726882528.14505: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.14605: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.14711: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 27885 1726882528.14717: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.14811: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.14913: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 27885 1726882528.14918: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.15076: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.15230: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 27885 1726882528.15237: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.15260: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.15263: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # <<< 27885 1726882528.15267: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.15316: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.15359: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 27885 1726882528.15364: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.15462: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.15561: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.15758: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.15956: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 27885 1726882528.15968: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 27885 1726882528.15972: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.16006: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.16044: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 27885 1726882528.16047: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.16075: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.16099: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 27885 1726882528.16113: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.16178: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.16249: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 27885 1726882528.16258: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.16280: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.16306: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 27885 1726882528.16314: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.16370: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.16428: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 27885 1726882528.16434: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.16496: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.16553: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 27885 1726882528.16562: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.16814: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.17067: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 27885 1726882528.17072: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.17135: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.17192: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 27885 1726882528.17209: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.17235: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.17274: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 27885 1726882528.17280: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.17320: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.17345: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 27885 1726882528.17363: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.17392: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.17425: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 27885 1726882528.17431: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.17513: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.17589: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 27885 1726882528.17620: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27885 1726882528.17627: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 27885 1726882528.17632: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.17684: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.17723: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 27885 1726882528.17738: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.17759: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.17775: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.17826: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.17876: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.17943: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.18020: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 27885 1726882528.18027: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 27885 1726882528.18039: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.18091: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.18136: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 27885 1726882528.18150: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.18335: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.18525: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 27885 1726882528.18532: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.18581: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.18624: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 27885 1726882528.18638: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.18679: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.18728: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 27885 1726882528.18733: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.18817: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.18905: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 27885 1726882528.18908: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 27885 1726882528.19001: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.19087: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 27885 1726882528.19095: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 27885 1726882528.19159: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.19324: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 27885 1726882528.19349: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 27885 1726882528.19359: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 27885 1726882528.19400: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882528.19415: stdout chunk (state=3): >>>import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1776729f0> <<< 27885 1726882528.19417: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177673110> <<< 27885 1726882528.19461: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17766a5a0> <<< 27885 1726882528.32301: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 27885 1726882528.32305: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 27885 1726882528.32308: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1776ba9c0> <<< 27885 1726882528.32349: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 27885 1726882528.32355: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 27885 1726882528.32378: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1776b9070> <<< 27885 1726882528.32438: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 27885 1726882528.32442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882528.32468: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 27885 1726882528.32483: stdout chunk 
(state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1776ba5d0> <<< 27885 1726882528.32516: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1776ba0c0> <<< 27885 1726882528.32741: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 27885 1726882528.57541: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.60205078125, "5m": 0.505859375, "15m": 0.27685546875}, "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], <<< 27885 1726882528.57581: stdout chunk (state=3): >>>"ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3299, "used": 232}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 718, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": 
"rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794603008, "block_size": 4096, "block_total": 65519099, "block_available": 63914698, "block_used": 1604401, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["eth0", "rpltstbr", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist"<<< 27885 1726882528.57588: stdout chunk (state=3): >>>: "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off 
[fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:57:f6:54:9a:30", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "28", "epoch": "1726882528", "epoch_int": "1726882528", "date": "2024-09-20", "time": "21:35:28", "iso8601_micro": "2024-09-21T01:35:28.572289Z", "iso8601": "2024-09-21T01:35:28Z", "iso8601_basic": "20240920T213528572289", "iso8601_basic_short": "20240920T213528", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 27885 1726882528.58183: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 <<< 27885 1726882528.58198: stdout chunk 
(state=3): >>># clear sys.last_exc # clear sys.last_type <<< 27885 1726882528.58255: stdout chunk (state=3): >>># clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin <<< 27885 1726882528.58259: stdout chunk (state=3): >>># restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io <<< 27885 1726882528.58262: stdout chunk (state=3): >>># cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections<<< 27885 1726882528.58265: stdout chunk (state=3): >>> # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap <<< 27885 1726882528.58298: stdout chunk (state=3): >>># cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch <<< 27885 1726882528.58302: stdout chunk (state=3): >>># cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing 
urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing <<< 27885 1726882528.58330: stdout chunk (state=3): >>># destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json <<< 27885 1726882528.58337: stdout chunk (state=3): >>># cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token <<< 27885 1726882528.58359: stdout chunk (state=3): >>># cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging <<< 27885 1726882528.58374: stdout chunk (state=3): >>># cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text <<< 27885 1726882528.58381: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes <<< 27885 1726882528.58384: stdout chunk (state=3): >>># destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing <<< 27885 
1726882528.58416: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 27885 1726882528.58425: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace <<< 27885 1726882528.58450: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection <<< 27885 1726882528.58457: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env 
# cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python <<< 27885 1726882528.58486: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd <<< 27885 1726882528.58524: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter <<< 27885 1726882528.58530: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps <<< 27885 1726882528.58533: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network <<< 27885 1726882528.58543: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd <<< 27885 1726882528.58555: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # 
destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 27885 1726882528.58888: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 27885 1726882528.58896: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util<<< 27885 1726882528.58899: stdout chunk (state=3): >>> <<< 27885 1726882528.58925: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 27885 1726882528.58933: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 27885 1726882528.58957: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob <<< 27885 1726882528.58963: stdout chunk (state=3): >>># destroy ipaddress <<< 27885 1726882528.58987: stdout chunk (state=3): >>># destroy ntpath <<< 27885 1726882528.59006: stdout chunk (state=3): >>># destroy importlib <<< 27885 1726882528.59031: stdout chunk (state=3): >>># destroy zipimport <<< 27885 1726882528.59035: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 27885 1726882528.59040: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings <<< 27885 1726882528.59066: stdout chunk (state=3): >>># destroy _locale <<< 27885 1726882528.59073: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 27885 1726882528.59079: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 27885 1726882528.59126: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 27885 1726882528.59137: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 27885 1726882528.59187: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy<<< 27885 1726882528.59198: stdout chunk (state=3): >>> # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 27885 1726882528.59229: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 27885 1726882528.59236: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors <<< 27885 1726882528.59256: stdout chunk (state=3): >>># destroy shlex # destroy fcntl <<< 27885 1726882528.59279: stdout chunk (state=3): >>># destroy datetime <<< 27885 1726882528.59283: stdout chunk (state=3): >>># destroy subprocess # destroy base64 <<< 27885 1726882528.59299: stdout chunk (state=3): >>># destroy _ssl <<< 27885 1726882528.59321: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy 
getpass # destroy pwd # destroy termios <<< 27885 1726882528.59348: stdout chunk (state=3): >>># destroy json # destroy socket # destroy struct <<< 27885 1726882528.59389: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection <<< 27885 1726882528.59408: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 27885 1726882528.59465: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna <<< 27885 1726882528.59471: stdout chunk (state=3): >>># destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 27885 1726882528.59503: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 27885 1726882528.59558: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections <<< 27885 1726882528.59564: stdout chunk (state=3): >>># cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat <<< 27885 1726882528.59600: stdout chunk (state=3): >>># cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime <<< 27885 1726882528.59781: stdout chunk (state=3): >>># destroy sys.monitoring <<< 27885 1726882528.59802: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 27885 1726882528.59859: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing <<< 27885 1726882528.59879: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 27885 1726882528.59913: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 27885 1726882528.60020: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit <<< 27885 1726882528.60029: stdout chunk (state=3): >>># destroy _warnings # destroy math # destroy _bisect # destroy time <<< 27885 1726882528.60108: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools <<< 27885 1726882528.60112: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 27885 1726882528.60599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
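The framing above ("<<< 27885 <timestamp>: stdout chunk (state=3): >>> ...") is ansible-core's -vvvv connection debugging: everything between the markers is raw output streamed back from the remote module process, and the "# destroy ..." / "# cleanup[N] wiping ..." lines inside it are CPython's verbose-mode shutdown trace as the remote interpreter unloads its modules. When reading a pasted log like this, it can help to strip the framing so the remote output reads straight through. The sketch below is a minimal, hypothetical helper; the regex is an assumption based only on the marker format visible in this log, not on any stable ansible-core interface.

import re

# Minimal sketch, assuming the framing format seen in this log:
#   <<< <pid> <timestamp>: stdout chunk (state=N): >>> <remote output> ...
# Stripping those frames stitches the fragments back into plain remote output.
CHUNK_FRAME = re.compile(
    r"\s*<<< \d+ \d+\.\d+: (?:stdout|stderr) chunk \(state=\d+\): >>>\s*"
)

def stitch_chunks(log_text: str) -> str:
    """Drop the -vvvv chunk framing so the remote module output reads straight through."""
    return CHUNK_FRAME.sub(" ", log_text).strip()

if __name__ == "__main__":
    sample = ("# destroy multiprocessing.dummy <<< 27885 1726882528.58888: "
              "stdout chunk (state=3): >>># destroy _sitebuiltins")
    print(stitch_chunks(sample))  # -> "# destroy multiprocessing.dummy # destroy _sitebuiltins"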
<<< 27885 1726882528.60603: stdout chunk (state=3): >>><<< 27885 1726882528.60605: stderr chunk (state=3): >>><<< 27885 1726882528.60650: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1787bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17878bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1787bea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1787cd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1787cdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
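The assembled stdout beginning at "_low_level_execute_command() done: rc=0" above is the remote ansible.legacy.setup module running under a verbose interpreter: the "import 'x' # <_frozen_importlib_external.SourceFileLoader ...>" and "# ... .pyc matches ..." lines are CPython's import tracing, which appears to come from PYTHONVERBOSE=1 in the remote environment (it shows up later in this same output under ansible_env), and the zipimport lines further down reflect the module payload being shipped as a zip file and imported in place. The snippet below is a small illustrative sketch, not part of this run: it reproduces the same kind of trace locally by launching a child interpreter with -v, the command-line equivalent of PYTHONVERBOSE=1.

import subprocess
import sys

# Illustrative only: run a child interpreter in verbose mode (-v) and show the
# import-trace lines CPython writes to stderr, matching the format seen above.
proc = subprocess.run(
    [sys.executable, "-v", "-c", "import json"],
    capture_output=True,
    text=True,
)
# Show only the json-related trace lines to keep the output short.
for line in proc.stderr.splitlines():
    if "json" in line:
        print(line)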
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1785ebda0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1785ebfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1786237a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178623e30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178603a70> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178601190> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1785e8f50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178643710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178642330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178602060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1785ea810> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1786787a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1785e81d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff178678c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178678b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff178678ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1785e6cf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1786795b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178679280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17867a4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1786906e0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff178691df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff178692c60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1786932c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1786921b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff178693d40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178693470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17867a510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff178393b90> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1783bc620> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1783bc3b0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1783bc650> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1783bcf80> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1783bd970> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff1783bc830> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff178391d30> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1783bed20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1783bda90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17867ac00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1783eb080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17840b410> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17846c1d0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17846e930> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17846c2f0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1784391f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177d292e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17840a240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1783bfc50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7ff17840a330> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_jk0z8lda/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177d8afc0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177d69eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177d69010> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177d88e90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177dbe9f0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177dbe780> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177dbe090> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177dbe4e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177d8bc50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177dbf770> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 
'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177dbf9b0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177dbfef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c25d00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177c27920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c282f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c29490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c2bf80> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177c302c0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c2a240> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c33fb0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c32a80> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff177c327e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c32d50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c2a750> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177c78200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c783b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177c79e50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c79be0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177c7c350> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c7a480> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c7fad0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c7c4a0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177c80e00> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177c7d010> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177c80c50> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c78530> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177b0c290> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177b0d2e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c82a20> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177c83dd0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c82630> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177b11580> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177b12330> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177b0d4f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177b12120> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177b13560> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177b1e150> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177b1bec0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177c06b10> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177cfe7e0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177b1e330> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177b0d490> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177bb2150> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1777bc110> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1777bc6b0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177b98050> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177bb2cc0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177bb0830> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177bb0470> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1777bf380> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1777bec30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1777bee10> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1777be060> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1777bf530> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177822060> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1777bff80> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177bb0500> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177823d40> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177822e10> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff17785e420> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17784f080> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff177872000> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177871f70> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1776729f0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff177673110> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff17766a5a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1776ba9c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1776b9070> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1776ba5d0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1776ba0c0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.60205078125, "5m": 0.505859375, "15m": 0.27685546875}, "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, 
"crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3299, "used": 232}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 718, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794603008, "block_size": 4096, "block_total": 65519099, 
"block_available": 63914698, "block_used": 1604401, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["eth0", "rpltstbr", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:57:f6:54:9a:30", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", 
"tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "28", "epoch": "1726882528", "epoch_int": "1726882528", "date": "2024-09-20", "time": "21:35:28", "iso8601_micro": "2024-09-21T01:35:28.572289Z", "iso8601": "2024-09-21T01:35:28Z", "iso8601_basic": "20240920T213528572289", "iso8601_basic_short": "20240920T213528", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] 
removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing 
json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing 
ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # 
destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy 
ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] 
wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
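The block above is the raw return of the fact-gathering step: ansible.legacy.setup prints its ansible_facts JSON, but because PYTHONVERBOSE=1 is exported on the managed host (visible under ansible_env), the payload is surrounded by CPython's import and shutdown trace, and the SSH client's debug output follows as stderr. The invocation section records the module arguments that were used: gather_subset ["all"], gather_timeout 10, an empty filter, and fact_path /etc/ansible/facts.d. A minimal sketch of an equivalent explicit call follows; the play, host pattern, and filter values are illustrative assumptions, and only the remaining arguments mirror the logged invocation.

- hosts: managed_node2              # assumption: host pattern shown only for illustration
  gather_facts: false               # skip the implicit setup run; call it explicitly below
  tasks:
    - name: Gather the same facts explicitly
      ansible.builtin.setup:
        gather_subset:
          - all                     # as in the logged invocation
        gather_timeout: 10          # as in the logged invocation
        fact_path: /etc/ansible/facts.d
        filter:                     # empty in the log; shown here only to illustrate trimming the output
          - ansible_default_ipv4
          - ansible_interfaces

    - name: Use one of the gathered facts
      ansible.builtin.debug:
        msg: "Default route via {{ ansible_default_ipv4.interface }} ({{ ansible_default_ipv4.address }})"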
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] 
removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] 
removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy 
zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping 
_abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
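The warning above is informational: interpreter discovery resolved /usr/bin/python3.12 at runtime, so the meaning of that path could change if another Python interpreter is installed later. One way to make the choice explicit is to pin the interpreter per host or group instead of relying on discovery. A minimal sketch of an inventory entry (hypothetical file layout; the host name and interpreter path are taken from the warning itself):

    all:
      hosts:
        managed_node2:
          # pin the interpreter so discovery no longer has to guess the path
          ansible_python_interpreter: /usr/bin/python3.12

Equivalently, interpreter_python can be set under [defaults] in ansible.cfg; either form removes the dependency on automatic discovery and silences this warning.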
27885 1726882528.62982: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882527.4166353-27914-246816898720461/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882528.63028: _low_level_execute_command(): starting 27885 1726882528.63039: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882527.4166353-27914-246816898720461/ > /dev/null 2>&1 && sleep 0' 27885 1726882528.63598: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882528.63601: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882528.63603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882528.63605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882528.63607: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882528.63609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882528.63611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882528.63657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882528.63660: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882528.63667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882528.63728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882528.65581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882528.65584: stdout chunk (state=3): >>><<< 27885 1726882528.65595: stderr chunk (state=3): >>><<< 27885 1726882528.65610: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882528.65616: handler run complete 27885 1726882528.65694: variable 'ansible_facts' from source: unknown 27885 1726882528.65757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882528.65960: variable 'ansible_facts' from source: unknown 27885 1726882528.66015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882528.66098: attempt loop complete, returning result 27885 1726882528.66102: _execute() done 27885 1726882528.66104: dumping result to json 27885 1726882528.66125: done dumping result, returning 27885 1726882528.66136: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [12673a56-9f93-3fa5-01be-0000000000bf] 27885 1726882528.66139: sending task result for task 12673a56-9f93-3fa5-01be-0000000000bf ok: [managed_node2] 27885 1726882528.66752: no more pending results, returning what we have 27885 1726882528.66754: results queue empty 27885 1726882528.66755: checking for any_errors_fatal 27885 1726882528.66756: done checking for any_errors_fatal 27885 1726882528.66756: checking for max_fail_percentage 27885 1726882528.66757: done checking for max_fail_percentage 27885 1726882528.66757: checking to see if all hosts have failed and the running result is not ok 27885 1726882528.66758: done checking to see if all hosts have failed 27885 1726882528.66758: getting the remaining hosts for this loop 27885 1726882528.66759: done getting the remaining hosts for this loop 27885 1726882528.66762: getting the next task for host managed_node2 27885 1726882528.66766: done getting next task for host managed_node2 27885 1726882528.66767: ^ task is: TASK: meta (flush_handlers) 27885 1726882528.66768: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882528.66771: getting variables 27885 1726882528.66772: in VariableManager get_vars() 27885 1726882528.66786: Calling all_inventory to load vars for managed_node2 27885 1726882528.66788: Calling groups_inventory to load vars for managed_node2 27885 1726882528.66792: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882528.66801: Calling all_plugins_play to load vars for managed_node2 27885 1726882528.66802: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882528.66805: Calling groups_plugins_play to load vars for managed_node2 27885 1726882528.66920: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000bf 27885 1726882528.66924: WORKER PROCESS EXITING 27885 1726882528.66935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882528.67061: done with get_vars() 27885 1726882528.67068: done getting variables 27885 1726882528.67117: in VariableManager get_vars() 27885 1726882528.67123: Calling all_inventory to load vars for managed_node2 27885 1726882528.67125: Calling groups_inventory to load vars for managed_node2 27885 1726882528.67126: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882528.67129: Calling all_plugins_play to load vars for managed_node2 27885 1726882528.67131: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882528.67132: Calling groups_plugins_play to load vars for managed_node2 27885 1726882528.67220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882528.67333: done with get_vars() 27885 1726882528.67342: done queuing things up, now waiting for results queue to drain 27885 1726882528.67343: results queue empty 27885 1726882528.67343: checking for any_errors_fatal 27885 1726882528.67348: done checking for any_errors_fatal 27885 1726882528.67349: checking for max_fail_percentage 27885 1726882528.67349: done checking for max_fail_percentage 27885 1726882528.67350: checking to see if all hosts have failed and the running result is not ok 27885 1726882528.67350: done checking to see if all hosts have failed 27885 1726882528.67350: getting the remaining hosts for this loop 27885 1726882528.67351: done getting the remaining hosts for this loop 27885 1726882528.67353: getting the next task for host managed_node2 27885 1726882528.67356: done getting next task for host managed_node2 27885 1726882528.67357: ^ task is: TASK: Include the task 'el_repo_setup.yml' 27885 1726882528.67358: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882528.67359: getting variables 27885 1726882528.67360: in VariableManager get_vars() 27885 1726882528.67366: Calling all_inventory to load vars for managed_node2 27885 1726882528.67368: Calling groups_inventory to load vars for managed_node2 27885 1726882528.67369: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882528.67372: Calling all_plugins_play to load vars for managed_node2 27885 1726882528.67374: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882528.67375: Calling groups_plugins_play to load vars for managed_node2 27885 1726882528.67484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882528.67614: done with get_vars() 27885 1726882528.67620: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:11 Friday 20 September 2024 21:35:28 -0400 (0:00:01.308) 0:00:01.319 ****** 27885 1726882528.67666: entering _queue_task() for managed_node2/include_tasks 27885 1726882528.67668: Creating lock for include_tasks 27885 1726882528.67872: worker is 1 (out of 1 available) 27885 1726882528.67884: exiting _queue_task() for managed_node2/include_tasks 27885 1726882528.67900: done queuing things up, now waiting for results queue to drain 27885 1726882528.67902: waiting for pending results... 27885 1726882528.68039: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 27885 1726882528.68092: in run() - task 12673a56-9f93-3fa5-01be-000000000006 27885 1726882528.68103: variable 'ansible_search_path' from source: unknown 27885 1726882528.68136: calling self._execute() 27885 1726882528.68179: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882528.68185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882528.68197: variable 'omit' from source: magic vars 27885 1726882528.68268: _execute() done 27885 1726882528.68272: dumping result to json 27885 1726882528.68274: done dumping result, returning 27885 1726882528.68280: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [12673a56-9f93-3fa5-01be-000000000006] 27885 1726882528.68287: sending task result for task 12673a56-9f93-3fa5-01be-000000000006 27885 1726882528.68375: done sending task result for task 12673a56-9f93-3fa5-01be-000000000006 27885 1726882528.68377: WORKER PROCESS EXITING 27885 1726882528.68425: no more pending results, returning what we have 27885 1726882528.68429: in VariableManager get_vars() 27885 1726882528.68452: Calling all_inventory to load vars for managed_node2 27885 1726882528.68454: Calling groups_inventory to load vars for managed_node2 27885 1726882528.68457: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882528.68465: Calling all_plugins_play to load vars for managed_node2 27885 1726882528.68467: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882528.68470: Calling groups_plugins_play to load vars for managed_node2 27885 1726882528.68585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882528.68701: done with get_vars() 27885 1726882528.68706: variable 'ansible_search_path' from source: unknown 27885 1726882528.68716: we have included files to process 27885 1726882528.68716: 
generating all_blocks data 27885 1726882528.68717: done generating all_blocks data 27885 1726882528.68718: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 27885 1726882528.68718: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 27885 1726882528.68720: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 27885 1726882528.69139: in VariableManager get_vars() 27885 1726882528.69148: done with get_vars() 27885 1726882528.69155: done processing included file 27885 1726882528.69156: iterating over new_blocks loaded from include file 27885 1726882528.69158: in VariableManager get_vars() 27885 1726882528.69164: done with get_vars() 27885 1726882528.69165: filtering new block on tags 27885 1726882528.69174: done filtering new block on tags 27885 1726882528.69176: in VariableManager get_vars() 27885 1726882528.69181: done with get_vars() 27885 1726882528.69182: filtering new block on tags 27885 1726882528.69191: done filtering new block on tags 27885 1726882528.69194: in VariableManager get_vars() 27885 1726882528.69200: done with get_vars() 27885 1726882528.69201: filtering new block on tags 27885 1726882528.69208: done filtering new block on tags 27885 1726882528.69209: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 27885 1726882528.69213: extending task lists for all hosts with included blocks 27885 1726882528.69241: done extending task lists 27885 1726882528.69242: done processing included files 27885 1726882528.69242: results queue empty 27885 1726882528.69242: checking for any_errors_fatal 27885 1726882528.69243: done checking for any_errors_fatal 27885 1726882528.69244: checking for max_fail_percentage 27885 1726882528.69244: done checking for max_fail_percentage 27885 1726882528.69245: checking to see if all hosts have failed and the running result is not ok 27885 1726882528.69245: done checking to see if all hosts have failed 27885 1726882528.69245: getting the remaining hosts for this loop 27885 1726882528.69246: done getting the remaining hosts for this loop 27885 1726882528.69247: getting the next task for host managed_node2 27885 1726882528.69250: done getting next task for host managed_node2 27885 1726882528.69251: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 27885 1726882528.69253: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882528.69254: getting variables 27885 1726882528.69255: in VariableManager get_vars() 27885 1726882528.69259: Calling all_inventory to load vars for managed_node2 27885 1726882528.69260: Calling groups_inventory to load vars for managed_node2 27885 1726882528.69262: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882528.69265: Calling all_plugins_play to load vars for managed_node2 27885 1726882528.69267: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882528.69269: Calling groups_plugins_play to load vars for managed_node2 27885 1726882528.69363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882528.69476: done with get_vars() 27885 1726882528.69482: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:35:28 -0400 (0:00:00.018) 0:00:01.337 ****** 27885 1726882528.69528: entering _queue_task() for managed_node2/setup 27885 1726882528.69690: worker is 1 (out of 1 available) 27885 1726882528.69701: exiting _queue_task() for managed_node2/setup 27885 1726882528.69712: done queuing things up, now waiting for results queue to drain 27885 1726882528.69714: waiting for pending results... 27885 1726882528.69854: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 27885 1726882528.69912: in run() - task 12673a56-9f93-3fa5-01be-0000000000d0 27885 1726882528.69921: variable 'ansible_search_path' from source: unknown 27885 1726882528.69924: variable 'ansible_search_path' from source: unknown 27885 1726882528.69953: calling self._execute() 27885 1726882528.70005: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882528.70009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882528.70019: variable 'omit' from source: magic vars 27885 1726882528.70363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882528.71733: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882528.71774: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882528.71806: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882528.71838: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882528.71858: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882528.71918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882528.71938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882528.71954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 27885 1726882528.71979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882528.71989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882528.72108: variable 'ansible_facts' from source: unknown 27885 1726882528.72152: variable 'network_test_required_facts' from source: task vars 27885 1726882528.72178: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 27885 1726882528.72181: variable 'omit' from source: magic vars 27885 1726882528.72208: variable 'omit' from source: magic vars 27885 1726882528.72234: variable 'omit' from source: magic vars 27885 1726882528.72251: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882528.72271: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882528.72284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882528.72301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882528.72309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882528.72335: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882528.72338: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882528.72341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882528.72403: Set connection var ansible_pipelining to False 27885 1726882528.72406: Set connection var ansible_connection to ssh 27885 1726882528.72412: Set connection var ansible_timeout to 10 27885 1726882528.72414: Set connection var ansible_shell_type to sh 27885 1726882528.72420: Set connection var ansible_shell_executable to /bin/sh 27885 1726882528.72424: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882528.72443: variable 'ansible_shell_executable' from source: unknown 27885 1726882528.72446: variable 'ansible_connection' from source: unknown 27885 1726882528.72449: variable 'ansible_module_compression' from source: unknown 27885 1726882528.72451: variable 'ansible_shell_type' from source: unknown 27885 1726882528.72453: variable 'ansible_shell_executable' from source: unknown 27885 1726882528.72455: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882528.72457: variable 'ansible_pipelining' from source: unknown 27885 1726882528.72461: variable 'ansible_timeout' from source: unknown 27885 1726882528.72465: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882528.72558: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882528.72565: variable 'omit' from source: magic vars 27885 1726882528.72570: starting attempt loop 27885 
1726882528.72573: running the handler 27885 1726882528.72584: _low_level_execute_command(): starting 27885 1726882528.72590: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882528.73073: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882528.73077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882528.73081: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882528.73083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882528.73085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882528.73136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882528.73139: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882528.73142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882528.73214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882528.74792: stdout chunk (state=3): >>>/root <<< 27885 1726882528.74890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882528.74924: stderr chunk (state=3): >>><<< 27885 1726882528.74927: stdout chunk (state=3): >>><<< 27885 1726882528.74943: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882528.74954: _low_level_execute_command(): starting 27885 1726882528.74958: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882528.7494233-27956-73338112538767 `" && echo ansible-tmp-1726882528.7494233-27956-73338112538767="` echo /root/.ansible/tmp/ansible-tmp-1726882528.7494233-27956-73338112538767 `" ) && sleep 0' 27885 1726882528.75374: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882528.75377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882528.75379: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882528.75381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882528.75383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882528.75433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882528.75436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882528.75508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882528.77381: stdout chunk (state=3): >>>ansible-tmp-1726882528.7494233-27956-73338112538767=/root/.ansible/tmp/ansible-tmp-1726882528.7494233-27956-73338112538767 <<< 27885 1726882528.77485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882528.77511: stderr chunk (state=3): >>><<< 27885 1726882528.77514: stdout chunk (state=3): >>><<< 27885 1726882528.77578: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882528.7494233-27956-73338112538767=/root/.ansible/tmp/ansible-tmp-1726882528.7494233-27956-73338112538767 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 
1726882528.77582: variable 'ansible_module_compression' from source: unknown 27885 1726882528.77632: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 27885 1726882528.77679: variable 'ansible_facts' from source: unknown 27885 1726882528.77812: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882528.7494233-27956-73338112538767/AnsiballZ_setup.py 27885 1726882528.77904: Sending initial data 27885 1726882528.77907: Sent initial data (153 bytes) 27885 1726882528.78328: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882528.78332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882528.78334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882528.78336: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882528.78338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882528.78387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882528.78390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882528.78461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882528.79976: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882528.80035: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27885 1726882528.80100: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmphhmjtalj /root/.ansible/tmp/ansible-tmp-1726882528.7494233-27956-73338112538767/AnsiballZ_setup.py <<< 27885 1726882528.80103: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882528.7494233-27956-73338112538767/AnsiballZ_setup.py" <<< 27885 1726882528.80158: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmphhmjtalj" to remote "/root/.ansible/tmp/ansible-tmp-1726882528.7494233-27956-73338112538767/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882528.7494233-27956-73338112538767/AnsiballZ_setup.py" <<< 27885 1726882528.81316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882528.81348: stderr chunk (state=3): >>><<< 27885 1726882528.81352: stdout chunk (state=3): >>><<< 27885 1726882528.81365: done transferring module to remote 27885 1726882528.81375: _low_level_execute_command(): starting 27885 1726882528.81382: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882528.7494233-27956-73338112538767/ /root/.ansible/tmp/ansible-tmp-1726882528.7494233-27956-73338112538767/AnsiballZ_setup.py && sleep 0' 27885 1726882528.81908: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882528.81953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882528.81957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882528.82310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882528.83881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882528.83908: stderr chunk (state=3): >>><<< 27885 1726882528.83914: stdout chunk (state=3): >>><<< 27885 1726882528.83969: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882528.83973: _low_level_execute_command(): starting 27885 1726882528.83975: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882528.7494233-27956-73338112538767/AnsiballZ_setup.py && sleep 0' 27885 1726882528.84292: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882528.84308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27885 1726882528.84322: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882528.84363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882528.84375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882528.84448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882528.86637: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 27885 1726882528.86744: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 27885 1726882528.86777: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 27885 1726882528.86880: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 27885 1726882528.86906: stdout chunk (state=3): >>>import 'codecs' # <<< 27885 1726882528.86936: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 27885 1726882528.87014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 27885 1726882528.87032: 
stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec888184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec887e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8881aa50> <<< 27885 1726882528.87061: stdout chunk (state=3): >>>import '_signal' # import '_abc' # <<< 27885 1726882528.87207: stdout chunk (state=3): >>>import 'abc' # <<< 27885 1726882528.87234: stdout chunk (state=3): >>>import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # <<< 27885 1726882528.87264: stdout chunk (state=3): >>>import 'os' # <<< 27885 1726882528.87306: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 27885 1726882528.87333: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 27885 1726882528.87357: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 27885 1726882528.87411: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec885c9130> <<< 27885 1726882528.87446: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 27885 1726882528.87488: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec885c9fa0> <<< 27885 1726882528.87632: stdout chunk (state=3): >>>import 'site' # <<< 27885 1726882528.87636: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 27885 1726882528.87891: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 27885 1726882528.87913: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 27885 1726882528.87941: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882528.87957: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 27885 1726882528.87988: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 27885 1726882528.88008: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 27885 1726882528.88040: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88607e60> <<< 27885 1726882528.88107: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 27885 1726882528.88144: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88607f20> <<< 27885 1726882528.88170: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 27885 1726882528.88192: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 27885 1726882528.88229: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882528.88244: stdout chunk (state=3): >>>import 'itertools' # <<< 27885 1726882528.88275: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 27885 1726882528.88318: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8863f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 27885 1726882528.88329: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8863ff20> import '_collections' # <<< 27885 1726882528.88394: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8861fb30> <<< 27885 1726882528.88398: stdout chunk (state=3): >>>import '_functools' # <<< 27885 1726882528.88424: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8861d250> <<< 27885 1726882528.88511: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88605010> <<< 27885 1726882528.88540: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 27885 1726882528.88575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 27885 1726882528.88578: stdout chunk (state=3): >>>import '_sre' # <<< 27885 1726882528.88606: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 27885 1726882528.88627: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 27885 1726882528.88646: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 27885 1726882528.88664: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 27885 1726882528.88687: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8865f800> <<< 27885 1726882528.88705: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8865e450> <<< 27885 1726882528.88732: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8861e120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8865ccb0> <<< 27885 1726882528.88786: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 27885 1726882528.88806: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88694860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88604290> <<< 27885 1726882528.88846: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 27885 1726882528.88869: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec88694d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88694bc0><<< 27885 1726882528.88886: stdout chunk (state=3): >>> <<< 27885 1726882528.88914: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec88694fb0> <<< 27885 1726882528.88924: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88602db0> <<< 27885 1726882528.88954: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882528.88979: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 27885 1726882528.88999: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 27885 1726882528.89028: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec886956a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88695370> import 'importlib.machinery' # <<< 27885 1726882528.89058: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 27885 1726882528.89088: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 27885 1726882528.89107: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec886965a0> import 'importlib.util' # import 'runpy' # <<< 27885 1726882528.89130: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 27885 1726882528.89173: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 27885 1726882528.89210: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 27885 1726882528.89218: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec886ac7a0> import 'errno' # <<< 27885 1726882528.89243: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec886ade80> <<< 27885 1726882528.89271: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 27885 1726882528.89300: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 27885 1726882528.89326: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 27885 1726882528.89329: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec886aed20> <<< 27885 1726882528.89361: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec886af320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec886ae270> <<< 27885 1726882528.89391: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 27885 
1726882528.89403: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 27885 1726882528.89455: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec886afda0> <<< 27885 1726882528.89458: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec886af4d0> <<< 27885 1726882528.89514: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88696510> <<< 27885 1726882528.89520: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 27885 1726882528.89560: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 27885 1726882528.89564: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 27885 1726882528.89592: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 27885 1726882528.89617: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec883a3bf0> <<< 27885 1726882528.89646: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 27885 1726882528.89677: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec883cc740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec883cc4a0> <<< 27885 1726882528.89726: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec883cc680> <<< 27885 1726882528.89737: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 27885 1726882528.89850: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882528.90160: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec883ccfe0> # extension module '_blake2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec883cd910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec883cc8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec883a1d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 27885 1726882528.90166: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 27885 1726882528.90171: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec883ced20> <<< 27885 1726882528.90201: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec883cda60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88696750> <<< 27885 1726882528.90231: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 27885 1726882528.90284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882528.90311: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 27885 1726882528.90339: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 27885 1726882528.90442: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec883f7080> <<< 27885 1726882528.90469: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 27885 1726882528.90512: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8841b440> <<< 27885 1726882528.90546: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 27885 1726882528.90576: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 27885 1726882528.90629: stdout chunk (state=3): >>>import 'ntpath' # <<< 27885 1726882528.90758: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8847c230> <<< 27885 1726882528.90788: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from 
'/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 27885 1726882528.90852: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8847e990> <<< 27885 1726882528.90926: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8847c350> <<< 27885 1726882528.91011: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88449250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87d29310> <<< 27885 1726882528.91023: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8841a240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec883cfc50> <<< 27885 1726882528.91184: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 27885 1726882528.91206: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fec87d295b0> <<< 27885 1726882528.91507: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_nv6k951j/ansible_setup_payload.zip' # zipimport: zlib available <<< 27885 1726882528.91672: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 27885 1726882528.91735: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 27885 1726882528.91777: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87d92f90> <<< 27885 1726882528.91789: stdout chunk (state=3): >>>import '_typing' # <<< 27885 1726882528.91959: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87d71e80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87d710a0> <<< 27885 1726882528.91992: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.92054: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 27885 1726882528.92068: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 27885 1726882528.93509: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.94591: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import 
'__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87d91280> <<< 27885 1726882528.94630: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882528.94660: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 27885 1726882528.94689: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 27885 1726882528.94719: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 27885 1726882528.94737: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87dc2990> <<< 27885 1726882528.94753: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87dc2720> <<< 27885 1726882528.94786: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87dc2030> <<< 27885 1726882528.94820: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 27885 1726882528.94855: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87dc2480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87d93c20> <<< 27885 1726882528.94879: stdout chunk (state=3): >>>import 'atexit' # <<< 27885 1726882528.94898: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87dc3770> <<< 27885 1726882528.94930: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87dc3980> <<< 27885 1726882528.94955: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 27885 1726882528.94987: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 27885 1726882528.95000: stdout chunk (state=3): >>>import '_locale' # <<< 27885 1726882528.95052: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87dc3e90> import 'pwd' # <<< 27885 1726882528.95082: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 27885 
1726882528.95095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 27885 1726882528.95122: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c2dca0> <<< 27885 1726882528.95172: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882528.95203: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87c2f890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 27885 1726882528.95207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 27885 1726882528.95248: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c30290> <<< 27885 1726882528.95253: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 27885 1726882528.95309: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 27885 1726882528.95312: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c31400> <<< 27885 1726882528.95324: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 27885 1726882528.95344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 27885 1726882528.95378: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 27885 1726882528.95426: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c33e90> <<< 27885 1726882528.95461: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec88602ea0> <<< 27885 1726882528.95481: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c32180> <<< 27885 1726882528.95515: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 27885 1726882528.95537: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 27885 1726882528.95565: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 27885 1726882528.95576: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 27885 1726882528.95674: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 27885 1726882528.95694: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 27885 1726882528.95721: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c3be60> import '_tokenize' # <<< 27885 1726882528.95787: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c3a930> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c3a690> <<< 27885 1726882528.95808: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 27885 1726882528.95880: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c3ac00> <<< 27885 1726882528.95927: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c32660> <<< 27885 1726882528.95943: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87c7ff80> <<< 27885 1726882528.95977: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c80260> <<< 27885 1726882528.96004: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 27885 1726882528.96030: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 27885 1726882528.96060: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 27885 1726882528.96088: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87c81d00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c81ac0> <<< 27885 1726882528.96119: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 27885 1726882528.96122: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 27885 1726882528.96183: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' 
executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87c84290> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c823c0> <<< 27885 1726882528.96187: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 27885 1726882528.96239: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882528.96269: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 27885 1726882528.96313: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 27885 1726882528.96317: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c87a70> <<< 27885 1726882528.97005: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c84440> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87c887d0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87c88c20> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87c88ce0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c80470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87b10380> <<< 27885 1726882528.97009: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87b115e0> <<< 27885 1726882528.97011: stdout chunk 
(state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c8ab40> <<< 27885 1726882528.97014: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882528.97016: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87c8bec0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c8a780> <<< 27885 1726882528.97018: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available <<< 27885 1726882528.97116: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27885 1726882528.97142: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # # zipimport: zlib available <<< 27885 1726882528.97169: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 27885 1726882528.97294: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.97411: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.97933: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.98474: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 27885 1726882528.98500: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 27885 1726882528.98528: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 27885 1726882528.98539: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882528.98583: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87b19730> <<< 27885 1726882528.98661: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 27885 1726882528.98691: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87b1a4b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87b11460> <<< 27885 1726882528.98737: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 27885 1726882528.98760: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.98778: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.98806: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 27885 1726882528.98942: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.99089: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 27885 1726882528.99118: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87b1a480> <<< 27885 1726882528.99137: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882528.99584: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.00019: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.00086: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.00165: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 27885 1726882529.00179: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.00202: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.00245: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 27885 1726882529.00319: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.00438: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 27885 1726882529.00442: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27885 1726882529.00454: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 27885 1726882529.00487: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.00520: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 27885 1726882529.00547: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.00761: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.01003: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 27885 1726882529.01057: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 27885 1726882529.01060: stdout chunk (state=3): >>>import '_ast' # <<< 27885 1726882529.01231: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87b1b680> <<< 27885 1726882529.01234: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.01237: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.01272: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 27885 1726882529.01297: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 27885 1726882529.01438: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.01441: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available <<< 27885 1726882529.01487: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.01546: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.01616: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 27885 1726882529.01648: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882529.01719: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87b26030> <<< 27885 1726882529.01767: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87b21070> <<< 27885 1726882529.01867: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 27885 1726882529.01886: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27885 1726882529.01925: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.01952: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.02003: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 27885 1726882529.02020: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 27885 1726882529.02048: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 27885 1726882529.02064: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 27885 1726882529.02125: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 27885 1726882529.02159: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 27885 1726882529.02162: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 27885 1726882529.02210: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c0ea20> <<< 27885 1726882529.02256: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87dee6f0> <<< 27885 1726882529.02336: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87b26150> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87b1b140> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 27885 1726882529.02365: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.02378: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.02420: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 27885 1726882529.02479: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 27885 1726882529.02506: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.02510: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 27885 1726882529.02540: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.02564: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.02667: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.02671: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.02681: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.02714: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.02779: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.02795: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.02833: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 27885 1726882529.02843: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.02913: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.02981: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.03004: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.03044: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 27885 1726882529.03047: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.03214: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.03387: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.03424: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.03482: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882529.03520: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 27885 1726882529.03548: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 27885 1726882529.03575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 27885 1726882529.03604: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87bb6360> <<< 27885 1726882529.03645: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 27885 1726882529.03692: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 27885 1726882529.03717: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec877d01a0> <<< 27885 1726882529.03797: stdout chunk (state=3): >>># extension module '_pickle' loaded from 
'/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882529.03947: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec877d0590> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87ba3350> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87bb6ea0> <<< 27885 1726882529.03952: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87bb4a10> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87bb4620> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 27885 1726882529.03983: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 27885 1726882529.04014: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec877d3470> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec877d2d20> <<< 27885 1726882529.04070: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec877d2f00> <<< 27885 1726882529.04087: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec877d2150> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 27885 1726882529.04323: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec877d34d0> <<< 27885 1726882529.04331: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 27885 1726882529.04376: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec8782e000> import 'multiprocessing.connection' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec877d3fe0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87bb4740> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 27885 1726882529.04379: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.04439: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.04485: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 27885 1726882529.04513: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.04560: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.04660: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 27885 1726882529.04677: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.04742: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 27885 1726882529.04767: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.04855: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 27885 1726882529.04858: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.04873: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.05136: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 27885 1726882529.05151: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 27885 1726882529.05608: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.06036: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 27885 1726882529.06119: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27885 1726882529.06149: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.06185: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.06209: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 27885 1726882529.06238: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 27885 1726882529.06258: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.06297: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 27885 1726882529.06313: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.06360: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.06411: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 27885 1726882529.06434: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.06459: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.06492: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 27885 1726882529.06510: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 
1726882529.06534: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.06560: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 27885 1726882529.06583: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.06652: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.06739: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 27885 1726882529.06768: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8782fe60> <<< 27885 1726882529.06796: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 27885 1726882529.06810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 27885 1726882529.07009: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8782ec90> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 27885 1726882529.07013: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.07082: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 27885 1726882529.07088: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.07175: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.07265: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 27885 1726882529.07276: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.07338: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.07414: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 27885 1726882529.07461: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.07508: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 27885 1726882529.07549: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 27885 1726882529.07612: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882529.07673: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec8786e360> <<< 27885 1726882529.07853: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8785f140> import 'ansible.module_utils.facts.system.python' # <<< 27885 1726882529.07871: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.07924: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.07995: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 27885 1726882529.07998: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.08070: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.08156: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.08265: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 27885 1726882529.08423: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 27885 1726882529.08443: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.08461: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.08510: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 27885 1726882529.08518: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.08558: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.08607: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 27885 1726882529.08627: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882529.08657: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87881e20> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8782e990> import 'ansible.module_utils.facts.system.user' # <<< 27885 1726882529.08700: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 27885 1726882529.08717: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.08833: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.08836: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 27885 1726882529.08953: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.09103: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 27885 1726882529.09215: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.09307: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.09350: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.09404: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 27885 1726882529.09421: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 27885 1726882529.09447: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.09464: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.09596: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.09750: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 27885 1726882529.09877: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.09999: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 27885 1726882529.10032: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.10036: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.10081: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.10615: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 27885 1726882529.11123: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 27885 1726882529.11137: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.11241: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.11355: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 27885 1726882529.11358: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.11449: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.11552: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 27885 1726882529.11722: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.11882: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 27885 1726882529.11911: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 27885 1726882529.11914: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.11967: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.11997: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 27885 1726882529.12008: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.12100: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.12200: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.12402: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.12622: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 27885 1726882529.12652: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.12696: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.12699: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 27885 1726882529.12740: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27885 1726882529.12761: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 27885 1726882529.12825: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.12899: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 27885 1726882529.12934: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.12964: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 27885 1726882529.12980: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.13017: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.13089: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 27885 1726882529.13143: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.13218: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 27885 1726882529.13221: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.13464: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.13735: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.linux' # <<< 27885 1726882529.13739: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.13791: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.13851: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 27885 1726882529.13874: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.13890: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.13943: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 27885 1726882529.13990: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.14019: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 27885 1726882529.14044: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.14403: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 27885 1726882529.14434: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 27885 1726882529.14486: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.14534: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.14607: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.14675: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 27885 1726882529.14703: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 27885 1726882529.14752: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.14820: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 27885 1726882529.14823: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.15002: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.15205: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 27885 1726882529.15208: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.15245: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.15306: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 27885 1726882529.15309: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.15349: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.15403: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 27885 1726882529.15481: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.15568: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 27885 1726882529.15581: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.15667: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.15759: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 27885 1726882529.15829: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.16722: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 27885 1726882529.16753: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 27885 1726882529.16756: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 27885 1726882529.16801: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec8767f800> <<< 27885 1726882529.16820: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8767e420> <<< 27885 1726882529.16868: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8767c110> <<< 27885 1726882529.17256: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "29", "epoch": "1726882529", "epoch_int": "1726882529", "date": "2024-09-20", "time": "21:35:29", "iso8601_micro": "2024-09-21T01:35:29.162963Z", "iso8601": "2024-09-21T01:35:29Z", "iso8601_basic": "20240920T213529162963", "iso8601_basic_short": "20240920T213529", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, 
"ansible_ssh_host_key_rsa_public": "<<< 27885 1726882529.17260: stdout chunk (state=3): >>>AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 27885 1726882529.17717: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks <<< 27885 1726882529.17785: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path 
# restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc <<< 27885 1726882529.17789: stdout chunk (state=3): >>># cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig <<< 27885 1726882529.17862: stdout chunk (state=3): >>># cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] 
removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils <<< 27885 1726882529.17866: stdout chunk (state=3): >>># cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal <<< 27885 1726882529.17946: stdout chunk (state=3): >>># cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text <<< 27885 1726882529.17952: stdout chunk (state=3): >>># cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing 
ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 27885 1726882529.17985: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing 
ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base <<< 27885 1726882529.18053: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy 
ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr <<< 27885 1726882529.18065: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 27885 1726882529.18388: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 27885 1726882529.18424: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 27885 1726882529.18461: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy 
zipfile._path.glob<<< 27885 1726882529.18490: stdout chunk (state=3): >>> # destroy ipaddress <<< 27885 1726882529.18525: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder <<< 27885 1726882529.18551: stdout chunk (state=3): >>># destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select <<< 27885 1726882529.18583: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid<<< 27885 1726882529.18596: stdout chunk (state=3): >>> # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 27885 1726882529.18644: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle <<< 27885 1726882529.18696: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process <<< 27885 1726882529.18727: stdout chunk (state=3): >>># destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime <<< 27885 1726882529.18786: stdout chunk (state=3): >>># destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json <<< 27885 1726882529.18789: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 27885 1726882529.18838: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 27885 1726882529.18864: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 27885 1726882529.18913: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct <<< 27885 1726882529.18935: stdout chunk (state=3): >>># cleanup[3] 
wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 27885 1726882529.18981: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 27885 1726882529.18995: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 27885 1726882529.19125: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 27885 1726882529.19166: stdout chunk (state=3): >>># destroy _collections <<< 27885 1726882529.19182: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 27885 1726882529.19224: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 27885 1726882529.19275: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 27885 1726882529.19278: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 27885 1726882529.19365: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 27885 1726882529.19395: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 27885 1726882529.19424: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib <<< 27885 1726882529.19449: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools <<< 27885 1726882529.19469: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 27885 1726882529.19808: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 27885 1726882529.19840: stderr chunk (state=3): >>>Shared connection to 10.31.14.69 closed. <<< 27885 1726882529.19843: stdout chunk (state=3): >>><<< 27885 1726882529.19845: stderr chunk (state=3): >>><<< 27885 1726882529.19988: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec888184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec887e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8881aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec885c9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec885c9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88607e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88607f20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8863f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8863ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8861fb30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8861d250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88605010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8865f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8865e450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8861e120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8865ccb0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88694860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88604290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec88694d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88694bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec88694fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88602db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec886956a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88695370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec886965a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec886ac7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec886ade80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fec886aed20> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec886af320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec886ae270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec886afda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec886af4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88696510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec883a3bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec883cc740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec883cc4a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec883cc680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec883ccfe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec883cd910> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec883cc8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec883a1d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec883ced20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec883cda60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88696750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec883f7080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8841b440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8847c230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8847e990> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8847c350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec88449250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87d29310> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8841a240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec883cfc50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7fec87d295b0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_nv6k951j/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87d92f90> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87d71e80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87d710a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87d91280> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87dc2990> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87dc2720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87dc2030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87dc2480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87d93c20> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87dc3770> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87dc3980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87dc3e90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c2dca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87c2f890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c30290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c31400> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c33e90> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec88602ea0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c32180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c3be60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c3a930> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c3a690> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c3ac00> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c32660> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87c7ff80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c80260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87c81d00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c81ac0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87c84290> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c823c0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c87a70> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c84440> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87c887d0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87c88c20> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87c88ce0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c80470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87b10380> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87b115e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c8ab40> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87c8bec0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c8a780> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87b19730> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87b1a4b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87b11460> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87b1a480> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87b1b680> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87b26030> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87b21070> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87c0ea20> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87dee6f0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87b26150> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87b1b140> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87bb6360> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec877d01a0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fec877d0590> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87ba3350> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87bb6ea0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87bb4a10> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87bb4620> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec877d3470> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec877d2d20> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec877d2f00> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec877d2150> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec877d34d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec8782e000> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec877d3fe0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec87bb4740> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8782fe60> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8782ec90> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec8786e360> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8785f140> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec87881e20> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8782e990> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec8767f800> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8767e420> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec8767c110> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "29", "epoch": "1726882529", "epoch_int": "1726882529", "date": "2024-09-20", "time": "21:35:29", "iso8601_micro": "2024-09-21T01:35:29.162963Z", "iso8601": "2024-09-21T01:35:29Z", "iso8601_basic": "20240920T213529162963", "iso8601_basic_short": "20240920T213529", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": 
"ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing 
_weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] 
removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy 
ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # 
destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath 
# cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] 
removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] 
removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # 
destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 27885 1726882529.21183: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882528.7494233-27956-73338112538767/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882529.21186: _low_level_execute_command(): starting 27885 1726882529.21189: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882528.7494233-27956-73338112538767/ > /dev/null 2>&1 && sleep 0' 27885 1726882529.21420: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882529.21508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882529.21527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882529.21539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882529.21557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882529.21640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882529.23485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882529.23556: stderr chunk (state=3): >>><<< 27885 1726882529.23560: stdout chunk (state=3): >>><<< 27885 1726882529.23578: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882529.23588: handler run complete 27885 1726882529.23631: variable 'ansible_facts' from source: unknown 27885 1726882529.23666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882529.23803: variable 'ansible_facts' from source: unknown 27885 1726882529.23807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882529.23822: attempt loop complete, returning result 27885 1726882529.23826: _execute() done 27885 1726882529.23829: dumping result to json 27885 1726882529.23839: done dumping result, returning 27885 1726882529.23847: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [12673a56-9f93-3fa5-01be-0000000000d0] 27885 1726882529.23851: sending task result for task 12673a56-9f93-3fa5-01be-0000000000d0 27885 1726882529.23977: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000d0 27885 1726882529.23979: WORKER PROCESS EXITING ok: [managed_node2] 27885 1726882529.24075: no more pending results, returning what we have 27885 1726882529.24078: results queue empty 27885 1726882529.24079: checking for any_errors_fatal 27885 1726882529.24080: done checking for any_errors_fatal 27885 1726882529.24080: checking for max_fail_percentage 27885 1726882529.24082: done checking for 
max_fail_percentage 27885 1726882529.24082: checking to see if all hosts have failed and the running result is not ok 27885 1726882529.24083: done checking to see if all hosts have failed 27885 1726882529.24084: getting the remaining hosts for this loop 27885 1726882529.24085: done getting the remaining hosts for this loop 27885 1726882529.24088: getting the next task for host managed_node2 27885 1726882529.24104: done getting next task for host managed_node2 27885 1726882529.24108: ^ task is: TASK: Check if system is ostree 27885 1726882529.24110: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882529.24114: getting variables 27885 1726882529.24115: in VariableManager get_vars() 27885 1726882529.24138: Calling all_inventory to load vars for managed_node2 27885 1726882529.24141: Calling groups_inventory to load vars for managed_node2 27885 1726882529.24143: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882529.24152: Calling all_plugins_play to load vars for managed_node2 27885 1726882529.24154: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882529.24157: Calling groups_plugins_play to load vars for managed_node2 27885 1726882529.24307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882529.24487: done with get_vars() 27885 1726882529.24502: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:35:29 -0400 (0:00:00.550) 0:00:01.888 ****** 27885 1726882529.24599: entering _queue_task() for managed_node2/stat 27885 1726882529.25085: worker is 1 (out of 1 available) 27885 1726882529.25098: exiting _queue_task() for managed_node2/stat 27885 1726882529.25108: done queuing things up, now waiting for results queue to drain 27885 1726882529.25109: waiting for pending results... 
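The trace that follows runs the queued stat action for this task. The task file itself (el_repo_setup.yml:17) is not reproduced in this log, but from the task name and the conditional the executor evaluates below, (not __network_is_ostree is defined), it is most likely a conventional ostree detection check of roughly the following shape; the marker path and the register name are assumptions for illustration, not values taken from this log:

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted   # assumed: the usual marker file present on ostree-based systems
      register: __ostree_booted_stat   # assumed register name, not visible in this log
      when: not __network_is_ostree is defined   # conditional as evaluated in the trace below

A follow-up set_fact that derives __network_is_ostree from the stat result would account for the guard against the variable already being defined.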
27885 1726882529.25131: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 27885 1726882529.25256: in run() - task 12673a56-9f93-3fa5-01be-0000000000d2 27885 1726882529.25309: variable 'ansible_search_path' from source: unknown 27885 1726882529.25313: variable 'ansible_search_path' from source: unknown 27885 1726882529.25379: calling self._execute() 27885 1726882529.25447: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882529.25451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882529.25464: variable 'omit' from source: magic vars 27885 1726882529.25886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882529.26111: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882529.26154: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882529.26185: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882529.26238: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882529.26326: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882529.26338: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882529.26420: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882529.26424: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882529.26500: Evaluated conditional (not __network_is_ostree is defined): True 27885 1726882529.26504: variable 'omit' from source: magic vars 27885 1726882529.26542: variable 'omit' from source: magic vars 27885 1726882529.26574: variable 'omit' from source: magic vars 27885 1726882529.26597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882529.26625: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882529.26641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882529.26663: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882529.26666: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882529.26700: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882529.26704: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882529.26706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882529.26789: Set connection var ansible_pipelining to False 27885 1726882529.26797: Set connection var ansible_connection to ssh 27885 1726882529.26800: Set connection 
var ansible_timeout to 10 27885 1726882529.26802: Set connection var ansible_shell_type to sh 27885 1726882529.26808: Set connection var ansible_shell_executable to /bin/sh 27885 1726882529.26812: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882529.26831: variable 'ansible_shell_executable' from source: unknown 27885 1726882529.26834: variable 'ansible_connection' from source: unknown 27885 1726882529.26837: variable 'ansible_module_compression' from source: unknown 27885 1726882529.26839: variable 'ansible_shell_type' from source: unknown 27885 1726882529.26842: variable 'ansible_shell_executable' from source: unknown 27885 1726882529.26849: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882529.26854: variable 'ansible_pipelining' from source: unknown 27885 1726882529.26857: variable 'ansible_timeout' from source: unknown 27885 1726882529.26859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882529.26963: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882529.26966: variable 'omit' from source: magic vars 27885 1726882529.26969: starting attempt loop 27885 1726882529.26971: running the handler 27885 1726882529.26983: _low_level_execute_command(): starting 27885 1726882529.26994: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882529.27616: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882529.27642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882529.27660: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882529.27682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882529.27817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882529.29406: stdout chunk (state=3): >>>/root <<< 27885 1726882529.29539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882529.29554: stderr chunk (state=3): >>><<< 27885 1726882529.29567: stdout chunk (state=3): >>><<< 27885 1726882529.29597: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882529.29700: _low_level_execute_command(): starting 27885 1726882529.29704: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882529.296113-27981-209570674006773 `" && echo ansible-tmp-1726882529.296113-27981-209570674006773="` echo /root/.ansible/tmp/ansible-tmp-1726882529.296113-27981-209570674006773 `" ) && sleep 0' 27885 1726882529.30352: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882529.30630: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882529.30633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882529.30654: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882529.30743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882529.32609: stdout chunk (state=3): >>>ansible-tmp-1726882529.296113-27981-209570674006773=/root/.ansible/tmp/ansible-tmp-1726882529.296113-27981-209570674006773 <<< 27885 1726882529.32716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882529.32776: stderr chunk (state=3): >>><<< 27885 1726882529.32784: stdout chunk (state=3): >>><<< 27885 1726882529.32810: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882529.296113-27981-209570674006773=/root/.ansible/tmp/ansible-tmp-1726882529.296113-27981-209570674006773 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882529.32868: variable 'ansible_module_compression' from source: unknown 27885 1726882529.32938: ANSIBALLZ: Using lock for stat 27885 1726882529.32945: ANSIBALLZ: Acquiring lock 27885 1726882529.32953: ANSIBALLZ: Lock acquired: 140560086931056 27885 1726882529.32961: ANSIBALLZ: Creating module 27885 1726882529.42845: ANSIBALLZ: Writing module into payload 27885 1726882529.42910: ANSIBALLZ: Writing module 27885 1726882529.42926: ANSIBALLZ: Renaming module 27885 1726882529.42932: ANSIBALLZ: Done creating module 27885 1726882529.42946: variable 'ansible_facts' from source: unknown 27885 1726882529.42992: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882529.296113-27981-209570674006773/AnsiballZ_stat.py 27885 1726882529.43094: Sending initial data 27885 1726882529.43098: Sent initial data (152 bytes) 27885 1726882529.43551: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882529.43555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882529.43557: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882529.43559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882529.43561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882529.43612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882529.43615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882529.43684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882529.45203: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882529.45259: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882529.45325: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpqwxtgdjj /root/.ansible/tmp/ansible-tmp-1726882529.296113-27981-209570674006773/AnsiballZ_stat.py <<< 27885 1726882529.45327: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882529.296113-27981-209570674006773/AnsiballZ_stat.py" <<< 27885 1726882529.45384: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpqwxtgdjj" to remote "/root/.ansible/tmp/ansible-tmp-1726882529.296113-27981-209570674006773/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882529.296113-27981-209570674006773/AnsiballZ_stat.py" <<< 27885 1726882529.45970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882529.46012: stderr chunk (state=3): >>><<< 27885 1726882529.46016: stdout chunk (state=3): >>><<< 27885 1726882529.46061: done transferring module to remote 27885 1726882529.46073: _low_level_execute_command(): starting 27885 1726882529.46077: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882529.296113-27981-209570674006773/ /root/.ansible/tmp/ansible-tmp-1726882529.296113-27981-209570674006773/AnsiballZ_stat.py && sleep 0' 27885 1726882529.46511: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882529.46514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882529.46517: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882529.46519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882529.46521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882529.46565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882529.46569: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882529.46639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882529.48348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882529.48369: stderr chunk (state=3): >>><<< 27885 1726882529.48384: stdout chunk (state=3): >>><<< 27885 1726882529.48470: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882529.48473: _low_level_execute_command(): starting 27885 1726882529.48476: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882529.296113-27981-209570674006773/AnsiballZ_stat.py && sleep 0' 27885 1726882529.48976: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882529.48988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882529.49005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882529.49020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882529.49121: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882529.49125: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882529.49147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882529.49159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882529.49248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 27885 1726882529.51341: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 27885 1726882529.51371: stdout chunk (state=3): >>>import _imp # builtin <<< 27885 1726882529.51408: stdout chunk (state=3): >>>import '_thread' # <<< 27885 1726882529.51414: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 27885 1726882529.51475: stdout chunk (state=3): >>>import '_io' # <<< 27885 1726882529.51479: stdout chunk (state=3): >>>import 'marshal' # <<< 27885 1726882529.51512: stdout chunk (state=3): >>>import 'posix' # <<< 27885 1726882529.51542: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 27885 1726882529.51566: stdout chunk (state=3): >>>import 'time' # <<< 27885 1726882529.51577: stdout chunk (state=3): >>>import 'zipimport' # <<< 27885 1726882529.51582: stdout chunk (state=3): >>># installed zipimport hook <<< 27885 1726882529.51629: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 27885 1726882529.51636: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882529.51645: stdout chunk (state=3): >>>import '_codecs' # <<< 27885 1726882529.51677: stdout chunk (state=3): >>>import 'codecs' # <<< 27885 1726882529.51720: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 27885 1726882529.51758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 27885 1726882529.51778: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958fb44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958f83b00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958fb6a50> <<< 27885 1726882529.51824: stdout chunk (state=3): >>>import '_signal' # <<< 27885 1726882529.51874: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 27885 1726882529.51899: stdout chunk (state=3): >>>import 'io' # import '_stat' # import 'stat' # <<< 27885 1726882529.51987: stdout chunk (state=3): >>>import '_collections_abc' # <<< 27885 1726882529.52008: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 27885 1726882529.52041: stdout chunk (state=3): >>>import 'os' # <<< 27885 1726882529.52047: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages <<< 27885 1726882529.52081: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 27885 1726882529.52084: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 27885 1726882529.52123: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 27885 1726882529.52136: stdout chunk (state=3): >>>import 
'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958d65130> <<< 27885 1726882529.52205: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 27885 1726882529.52220: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958d65fa0> <<< 27885 1726882529.52236: stdout chunk (state=3): >>>import 'site' # <<< 27885 1726882529.52263: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 27885 1726882529.52555: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 27885 1726882529.52607: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 27885 1726882529.52648: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 27885 1726882529.52669: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958da3e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 27885 1726882529.52679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 27885 1726882529.52701: stdout chunk (state=3): >>>import '_operator' # <<< 27885 1726882529.52710: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958da3ef0> <<< 27885 1726882529.52720: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 27885 1726882529.52746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 27885 1726882529.52764: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 27885 1726882529.52814: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882529.52829: stdout chunk (state=3): >>>import 'itertools' # <<< 27885 1726882529.52863: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 27885 1726882529.52866: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958ddb860> <<< 27885 1726882529.52891: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches 
/usr/lib64/python3.12/reprlib.py <<< 27885 1726882529.52898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 27885 1726882529.52910: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958ddbef0> <<< 27885 1726882529.52915: stdout chunk (state=3): >>>import '_collections' # <<< 27885 1726882529.52963: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958dbbb30> <<< 27885 1726882529.52968: stdout chunk (state=3): >>>import '_functools' # <<< 27885 1726882529.52997: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958db9220> <<< 27885 1726882529.53085: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958da1010> <<< 27885 1726882529.53116: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 27885 1726882529.53129: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 27885 1726882529.53144: stdout chunk (state=3): >>>import '_sre' # <<< 27885 1726882529.53161: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 27885 1726882529.53187: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 27885 1726882529.53213: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 27885 1726882529.53217: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 27885 1726882529.53241: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958dfb7a0> <<< 27885 1726882529.53259: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958dfa3c0> <<< 27885 1726882529.53285: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 27885 1726882529.53292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958dba0f0> <<< 27885 1726882529.53305: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958da28d0> <<< 27885 1726882529.53349: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 27885 1726882529.53362: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e307d0> <<< 27885 1726882529.53374: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958da0290> <<< 27885 1726882529.53386: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 27885 1726882529.53419: stdout chunk (state=3): >>># extension 
module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882529.53430: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958e30c80> <<< 27885 1726882529.53435: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e30b30> <<< 27885 1726882529.53465: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882529.53468: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958e30f20> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958d9edb0> <<< 27885 1726882529.53503: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 27885 1726882529.53511: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882529.53521: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 27885 1726882529.53563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e315e0> <<< 27885 1726882529.53568: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e312b0> import 'importlib.machinery' # <<< 27885 1726882529.53604: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 27885 1726882529.53616: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 27885 1726882529.53631: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e324b0> <<< 27885 1726882529.53634: stdout chunk (state=3): >>>import 'importlib.util' # <<< 27885 1726882529.53655: stdout chunk (state=3): >>>import 'runpy' # <<< 27885 1726882529.53666: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 27885 1726882529.53701: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 27885 1726882529.53725: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 27885 1726882529.53733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e486b0> <<< 27885 1726882529.53746: stdout chunk (state=3): >>>import 'errno' # <<< 27885 1726882529.53775: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882529.53783: stdout chunk (state=3): >>># extension module 'zlib' executed from 
'/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958e49d60> <<< 27885 1726882529.53803: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 27885 1726882529.53809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 27885 1726882529.53837: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 27885 1726882529.53841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 27885 1726882529.53851: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e4ac00> <<< 27885 1726882529.53890: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882529.53901: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958e4b260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e4a150> <<< 27885 1726882529.53914: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 27885 1726882529.53926: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 27885 1726882529.53960: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882529.53982: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958e4bce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e4b410> <<< 27885 1726882529.54024: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e32420> <<< 27885 1726882529.54043: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 27885 1726882529.54064: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 27885 1726882529.54095: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 27885 1726882529.54105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 27885 1726882529.54140: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882529.54145: stdout chunk (state=3): >>>import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958bc3c50> <<< 27885 1726882529.54165: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 27885 1726882529.54169: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 27885 1726882529.54204: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882529.54206: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958bec6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958bec410> <<< 27885 1726882529.54229: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958bec6e0> <<< 27885 1726882529.54258: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 27885 1726882529.54338: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882529.54466: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958bed010> <<< 27885 1726882529.54697: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958beda00> <<< 27885 1726882529.54701: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958bec8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958bc1df0> <<< 27885 1726882529.54703: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958beee10> <<< 27885 1726882529.54928: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958bedb50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e32bd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958c171a0> <<< 27885 
1726882529.54935: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 27885 1726882529.54953: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882529.54967: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 27885 1726882529.54996: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 27885 1726882529.55027: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958c3b500> <<< 27885 1726882529.55051: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 27885 1726882529.55096: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 27885 1726882529.55147: stdout chunk (state=3): >>>import 'ntpath' # <<< 27885 1726882529.55162: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882529.55180: stdout chunk (state=3): >>>import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958c9c230> <<< 27885 1726882529.55190: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 27885 1726882529.55219: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 27885 1726882529.55241: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 27885 1726882529.55281: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 27885 1726882529.55365: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958c9e990> <<< 27885 1726882529.55445: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958c9c350> <<< 27885 1726882529.55469: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958c61250> <<< 27885 1726882529.55505: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 27885 1726882529.55525: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958521370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958c3a300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958befd70> <<< 27885 1726882529.55634: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 27885 1726882529.55645: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa958521610> <<< 27885 1726882529.55814: stdout chunk (state=3): >>># zipimport: found 30 names in 
'/tmp/ansible_stat_payload_kb_1t9f1/ansible_stat_payload.zip' <<< 27885 1726882529.55824: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.55939: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.55976: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 27885 1726882529.55979: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 27885 1726882529.56021: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 27885 1726882529.56086: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 27885 1726882529.56121: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9585770b0> <<< 27885 1726882529.56141: stdout chunk (state=3): >>>import '_typing' # <<< 27885 1726882529.56313: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958555fa0> <<< 27885 1726882529.56335: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958555130> # zipimport: zlib available <<< 27885 1726882529.56373: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 27885 1726882529.56402: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27885 1726882529.56406: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 27885 1726882529.56426: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.57774: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.58875: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958574e00> <<< 27885 1726882529.58906: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882529.58933: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 27885 1726882529.58962: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 27885 1726882529.58982: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 27885 1726882529.59005: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa95859e9f0> <<< 27885 
1726882529.59027: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95859e780> <<< 27885 1726882529.59065: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95859e090> <<< 27885 1726882529.59085: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 27885 1726882529.59137: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95859e4e0> <<< 27885 1726882529.59155: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958577b30> import 'atexit' # <<< 27885 1726882529.59177: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa95859f770> <<< 27885 1726882529.59204: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa95859f950> <<< 27885 1726882529.59224: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 27885 1726882529.59310: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 27885 1726882529.59324: stdout chunk (state=3): >>>import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95859fe90> import 'pwd' # <<< 27885 1726882529.59351: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 27885 1726882529.59369: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 27885 1726882529.59403: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958409c70> <<< 27885 1726882529.59440: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa95840b890> <<< 27885 1726882529.59462: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 27885 1726882529.59480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 27885 1726882529.59507: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95840c290> <<< 27885 1726882529.59529: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 27885 1726882529.59549: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 27885 1726882529.59577: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95840d430> <<< 27885 1726882529.59602: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 27885 1726882529.59650: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 27885 1726882529.59672: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 27885 1726882529.59704: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95840fec0> <<< 27885 1726882529.59745: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958557140> <<< 27885 1726882529.59763: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95840e180> <<< 27885 1726882529.59800: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 27885 1726882529.59834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 27885 1726882529.59861: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 27885 1726882529.59878: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 27885 1726882529.59915: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958417e60> <<< 27885 1726882529.59948: stdout chunk (state=3): >>>import '_tokenize' # <<< 27885 1726882529.59988: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958416930> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958416690> <<< 27885 1726882529.60015: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 27885 1726882529.60088: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958416c00> <<< 27885 1726882529.60117: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95840e690> <<< 27885 1726882529.60160: stdout chunk (state=3): >>># extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa95845f9e0> <<< 27885 1726882529.60189: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958460170> <<< 27885 1726882529.60207: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 27885 1726882529.60241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 27885 1726882529.60286: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 27885 1726882529.60292: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958461be0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9584619a0> <<< 27885 1726882529.60313: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 27885 1726882529.60404: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 27885 1726882529.60458: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9584641a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9584622d0> <<< 27885 1726882529.60485: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 27885 1726882529.60537: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882529.60563: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 27885 1726882529.60607: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958467980> <<< 27885 1726882529.60721: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958464350> <<< 27885 1726882529.60782: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # 
extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958468770> <<< 27885 1726882529.60820: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958468830> <<< 27885 1726882529.60865: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882529.60870: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958468ad0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958460350> <<< 27885 1726882529.60898: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 27885 1726882529.60929: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 27885 1726882529.60955: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 27885 1726882529.60963: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882529.60985: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9584f4290> <<< 27885 1726882529.61132: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 27885 1726882529.61155: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9584f5610> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95846aa20> <<< 27885 1726882529.61202: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa95846bdd0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95846a690> <<< 27885 1726882529.61223: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.61226: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 27885 1726882529.61251: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 27885 1726882529.61315: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.61425: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.61430: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 27885 1726882529.61468: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 27885 1726882529.61473: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.61592: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.61701: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.62219: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.62781: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 27885 1726882529.62788: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 27885 1726882529.62827: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882529.62860: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9584f9730> <<< 27885 1726882529.62956: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 27885 1726882529.62960: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9584fa450> <<< 27885 1726882529.63005: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9584f5730> <<< 27885 1726882529.63051: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 27885 1726882529.63075: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 27885 1726882529.63211: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.63521: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9584fa150> # zipimport: zlib available <<< 27885 1726882529.63841: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.64279: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.64348: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.64425: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 27885 1726882529.64443: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.64469: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 
1726882529.64503: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 27885 1726882529.64530: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.64579: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.64663: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 27885 1726882529.64696: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 27885 1726882529.64717: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.64758: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.64795: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 27885 1726882529.64808: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.65039: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.65254: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 27885 1726882529.65327: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 27885 1726882529.65330: stdout chunk (state=3): >>>import '_ast' # <<< 27885 1726882529.65406: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9584fb500> <<< 27885 1726882529.65410: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.65465: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.65547: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 27885 1726882529.65577: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 27885 1726882529.65588: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.65630: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.65682: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 27885 1726882529.65922: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.65980: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882529.66063: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958306180> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958300d40> <<< 27885 1726882529.66067: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 27885 1726882529.66070: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.66135: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.66215: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 27885 1726882529.66226: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.66262: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 27885 1726882529.66314: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 27885 1726882529.66326: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 27885 1726882529.66522: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9585d6a50> <<< 27885 1726882529.66525: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9585ee720> <<< 27885 1726882529.66592: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958305f40> <<< 27885 1726882529.66608: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9584f59d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 27885 1726882529.66640: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.66665: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 27885 1726882529.66796: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 27885 1726882529.66898: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.67070: stdout chunk (state=3): >>># zipimport: zlib available <<< 27885 1726882529.67184: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 27885 1726882529.67212: stdout chunk (state=3): >>># destroy __main__ <<< 27885 1726882529.67470: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 <<< 27885 1726882529.67513: stdout chunk (state=3): >>># clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # 
cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat <<< 27885 1726882529.67554: stdout chunk (state=3): >>># cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib<<< 27885 1726882529.67583: stdout chunk (state=3): >>> # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible <<< 27885 1726882529.67623: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing 
atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections <<< 27885 1726882529.67673: stdout chunk (state=3): >>># destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 27885 1726882529.67938: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 27885 1726882529.67946: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 27885 1726882529.67979: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 27885 1726882529.68013: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile <<< 27885 1726882529.68087: stdout chunk (state=3): >>># destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess <<< 27885 1726882529.68096: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 27885 1726882529.68121: stdout chunk (state=3): >>># destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 27885 1726882529.68186: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 27885 1726882529.68220: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 27885 1726882529.68269: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # 
cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 27885 1726882529.68299: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 27885 1726882529.68441: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 27885 1726882529.68480: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 27885 1726882529.68523: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 27885 1726882529.68544: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 27885 1726882529.68561: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 27885 1726882529.68652: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 27885 1726882529.68703: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 27885 1726882529.68707: stdout chunk (state=3): >>># destroy _hashlib <<< 27885 1726882529.68785: stdout chunk (state=3): >>># destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 27885 1726882529.68788: stdout chunk (state=3): >>># clear sys.audit hooks <<< 27885 1726882529.69207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882529.69211: stdout chunk (state=3): >>><<< 27885 1726882529.69213: stderr chunk (state=3): >>><<< 27885 1726882529.69523: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958fb44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958f83b00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958fb6a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958d65130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958d65fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958da3e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958da3ef0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958ddb860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958ddbef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958dbbb30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958db9220> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958da1010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958dfb7a0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958dfa3c0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958dba0f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958da28d0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e307d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958da0290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958e30c80> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e30b30> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958e30f20> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958d9edb0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e315e0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e312b0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e324b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e486b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958e49d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fa958e4ac00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958e4b260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e4a150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958e4bce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e4b410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e32420> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958bc3c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958bec6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958bec410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958bec6e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958bed010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958beda00> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa958bec8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958bc1df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958beee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958bedb50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958e32bd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958c171a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958c3b500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958c9c230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958c9e990> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958c9c350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958c61250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958521370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958c3a300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958befd70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7fa958521610> # zipimport: found 30 names in '/tmp/ansible_stat_payload_kb_1t9f1/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9585770b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958555fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958555130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958574e00> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa95859e9f0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95859e780> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95859e090> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95859e4e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958577b30> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa95859f770> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa95859f950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95859fe90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958409c70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa95840b890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95840c290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95840d430> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95840fec0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958557140> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95840e180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958417e60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958416930> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958416690> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958416c00> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95840e690> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa95845f9e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958460170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958461be0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9584619a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9584641a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9584622d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958467980> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958464350> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958468770> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958468830> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958468ad0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958460350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9584f4290> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9584f5610> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95846aa20> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa95846bdd0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa95846a690> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9584f9730> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9584fa450> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9584f5730> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9584fa150> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9584fb500> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa958306180> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958300d40> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9585d6a50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9585ee720> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa958305f40> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9584f59d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # 
destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # 
destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
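Annotation: the module stdout above mixes the Python interpreter's verbose import/shutdown trace with the actual JSON result ({"changed": false, "stat": {"exists": false}}), which is why the controller emits the "junk after the JSON data" warning that follows. The controller still recovers the result because it filters the non-JSON noise around the payload. The sketch below illustrates that idea under simplified assumptions; `extract_module_json` is a hypothetical helper written for this annotation, not Ansible's internal parser, and it assumes the first '{' in the captured stdout starts the real payload.

```python
import json

def extract_module_json(stdout: str) -> dict:
    """Recover the JSON result object from module stdout surrounded by
    interpreter debug noise. Illustrative, simplified helper only."""
    start = stdout.find("{")          # assume the first brace begins the payload
    if start == -1:
        raise ValueError("no JSON object found in module output")
    # raw_decode parses one JSON document and ignores whatever trails it.
    obj, _end = json.JSONDecoder().raw_decode(stdout[start:])
    return obj

# Input shaped like the module stdout above: debug noise, the result, more noise.
noisy = (
    "# zipimport: zlib available\n"
    '{"changed": false, "stat": {"exists": false}}\n'
    "# destroy __main__ # clear sys.path_importer_cache"
)
print(extract_module_json(noisy))  # {'changed': False, 'stat': {'exists': False}}
```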
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache [... identical Python interpreter shutdown/cleanup trace as already shown in the module stdout above ...] # destroy builtins # destroy _thread # clear sys.audit hooks 27885 1726882529.70641: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882529.296113-27981-209570674006773/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882529.70644: _low_level_execute_command(): starting 27885 1726882529.70646: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r
/root/.ansible/tmp/ansible-tmp-1726882529.296113-27981-209570674006773/ > /dev/null 2>&1 && sleep 0' 27885 1726882529.70830: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882529.70861: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882529.70875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882529.70977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882529.71028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882529.71055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882529.71106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882529.71160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882529.73299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882529.73303: stdout chunk (state=3): >>><<< 27885 1726882529.73306: stderr chunk (state=3): >>><<< 27885 1726882529.73308: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882529.73311: handler run complete 27885 1726882529.73313: attempt loop complete, returning result 27885 1726882529.73315: _execute() done 27885 1726882529.73317: dumping result to json 27885 1726882529.73319: done dumping result, returning 27885 1726882529.73321: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [12673a56-9f93-3fa5-01be-0000000000d2] 27885 1726882529.73323: sending task result 
for task 12673a56-9f93-3fa5-01be-0000000000d2 27885 1726882529.73387: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000d2 27885 1726882529.73395: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 27885 1726882529.73459: no more pending results, returning what we have 27885 1726882529.73463: results queue empty 27885 1726882529.73464: checking for any_errors_fatal 27885 1726882529.73470: done checking for any_errors_fatal 27885 1726882529.73471: checking for max_fail_percentage 27885 1726882529.73473: done checking for max_fail_percentage 27885 1726882529.73474: checking to see if all hosts have failed and the running result is not ok 27885 1726882529.73475: done checking to see if all hosts have failed 27885 1726882529.73475: getting the remaining hosts for this loop 27885 1726882529.73477: done getting the remaining hosts for this loop 27885 1726882529.73481: getting the next task for host managed_node2 27885 1726882529.73487: done getting next task for host managed_node2 27885 1726882529.73494: ^ task is: TASK: Set flag to indicate system is ostree 27885 1726882529.73498: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882529.73502: getting variables 27885 1726882529.73503: in VariableManager get_vars() 27885 1726882529.73630: Calling all_inventory to load vars for managed_node2 27885 1726882529.73633: Calling groups_inventory to load vars for managed_node2 27885 1726882529.73637: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882529.73648: Calling all_plugins_play to load vars for managed_node2 27885 1726882529.73651: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882529.73654: Calling groups_plugins_play to load vars for managed_node2 27885 1726882529.74047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882529.74799: done with get_vars() 27885 1726882529.74810: done getting variables 27885 1726882529.74916: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:35:29 -0400 (0:00:00.503) 0:00:02.391 ****** 27885 1726882529.74944: entering _queue_task() for managed_node2/set_fact 27885 1726882529.74951: Creating lock for set_fact 27885 1726882529.75271: worker is 1 (out of 1 available) 27885 1726882529.75401: exiting _queue_task() for managed_node2/set_fact 27885 1726882529.75412: done queuing things up, now waiting for results queue to drain 27885 1726882529.75414: waiting for pending results... 
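The stat result just reported (exists: false) and the set_fact task queued above from el_repo_setup.yml:22 together implement the ostree detection. A minimal sketch of what these two tasks likely look like; only the task names, the registered variable __ostree_booted_stat, the conditional "not __network_is_ostree is defined", and the resulting fact value are grounded in this log, and the stat path is an assumption:

# Hedged reconstruction from the surrounding log; the stat path is an assumption.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted        # assumed marker file, not shown in this log
  register: __ostree_booted_stat

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined

With exists false, this yields __network_is_ostree: false, matching the ok result recorded in the execution that follows.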
27885 1726882529.75710: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 27885 1726882529.75720: in run() - task 12673a56-9f93-3fa5-01be-0000000000d3 27885 1726882529.75723: variable 'ansible_search_path' from source: unknown 27885 1726882529.75726: variable 'ansible_search_path' from source: unknown 27885 1726882529.75734: calling self._execute() 27885 1726882529.75815: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882529.75833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882529.75847: variable 'omit' from source: magic vars 27885 1726882529.76487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882529.76799: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882529.76804: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882529.76840: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882529.76920: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882529.76969: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882529.77006: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882529.77044: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882529.77076: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882529.77217: Evaluated conditional (not __network_is_ostree is defined): True 27885 1726882529.77246: variable 'omit' from source: magic vars 27885 1726882529.77275: variable 'omit' from source: magic vars 27885 1726882529.77465: variable '__ostree_booted_stat' from source: set_fact 27885 1726882529.77469: variable 'omit' from source: magic vars 27885 1726882529.77505: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882529.77540: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882529.77563: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882529.77600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882529.77616: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882529.77651: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882529.77683: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882529.77694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882529.77780: Set connection var ansible_pipelining to False 27885 
1726882529.77905: Set connection var ansible_connection to ssh 27885 1726882529.77909: Set connection var ansible_timeout to 10 27885 1726882529.77912: Set connection var ansible_shell_type to sh 27885 1726882529.77914: Set connection var ansible_shell_executable to /bin/sh 27885 1726882529.77916: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882529.77920: variable 'ansible_shell_executable' from source: unknown 27885 1726882529.77922: variable 'ansible_connection' from source: unknown 27885 1726882529.77924: variable 'ansible_module_compression' from source: unknown 27885 1726882529.77926: variable 'ansible_shell_type' from source: unknown 27885 1726882529.77927: variable 'ansible_shell_executable' from source: unknown 27885 1726882529.77929: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882529.77931: variable 'ansible_pipelining' from source: unknown 27885 1726882529.77933: variable 'ansible_timeout' from source: unknown 27885 1726882529.77934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882529.78054: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882529.78069: variable 'omit' from source: magic vars 27885 1726882529.78079: starting attempt loop 27885 1726882529.78086: running the handler 27885 1726882529.78199: handler run complete 27885 1726882529.78203: attempt loop complete, returning result 27885 1726882529.78206: _execute() done 27885 1726882529.78208: dumping result to json 27885 1726882529.78210: done dumping result, returning 27885 1726882529.78227: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [12673a56-9f93-3fa5-01be-0000000000d3] 27885 1726882529.78231: sending task result for task 12673a56-9f93-3fa5-01be-0000000000d3 27885 1726882529.78554: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000d3 27885 1726882529.78557: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 27885 1726882529.78621: no more pending results, returning what we have 27885 1726882529.78624: results queue empty 27885 1726882529.78625: checking for any_errors_fatal 27885 1726882529.78631: done checking for any_errors_fatal 27885 1726882529.78632: checking for max_fail_percentage 27885 1726882529.78634: done checking for max_fail_percentage 27885 1726882529.78634: checking to see if all hosts have failed and the running result is not ok 27885 1726882529.78635: done checking to see if all hosts have failed 27885 1726882529.78636: getting the remaining hosts for this loop 27885 1726882529.78638: done getting the remaining hosts for this loop 27885 1726882529.78642: getting the next task for host managed_node2 27885 1726882529.78650: done getting next task for host managed_node2 27885 1726882529.78653: ^ task is: TASK: Fix CentOS6 Base repo 27885 1726882529.78656: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882529.78660: getting variables 27885 1726882529.78662: in VariableManager get_vars() 27885 1726882529.78813: Calling all_inventory to load vars for managed_node2 27885 1726882529.78817: Calling groups_inventory to load vars for managed_node2 27885 1726882529.78820: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882529.78831: Calling all_plugins_play to load vars for managed_node2 27885 1726882529.78834: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882529.78843: Calling groups_plugins_play to load vars for managed_node2 27885 1726882529.79511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882529.79978: done with get_vars() 27885 1726882529.79987: done getting variables 27885 1726882529.80119: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:35:29 -0400 (0:00:00.051) 0:00:02.443 ****** 27885 1726882529.80146: entering _queue_task() for managed_node2/copy 27885 1726882529.80448: worker is 1 (out of 1 available) 27885 1726882529.80459: exiting _queue_task() for managed_node2/copy 27885 1726882529.80471: done queuing things up, now waiting for results queue to drain 27885 1726882529.80472: waiting for pending results... 
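The "Fix CentOS6 Base repo" task queued above (el_repo_setup.yml:26, handled by the copy action plugin) is guarded by the two conditionals evaluated in the execution that follows. A hedged sketch of its likely shape; the destination and the repo content are assumptions, since the log records only the conditions and the skip:

# Sketch only; the destination and content are assumed, the `when` guard is from the log.
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo          # assumed target file
    content: "{{ centos6_base_repo_content }}"       # hypothetical variable standing in for the real repo text
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'

On this host the distribution matches but the major version is not 6, so the task skips with false_condition "ansible_distribution_major_version == '6'", as shown below.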
27885 1726882529.80740: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 27885 1726882529.80849: in run() - task 12673a56-9f93-3fa5-01be-0000000000d5 27885 1726882529.80898: variable 'ansible_search_path' from source: unknown 27885 1726882529.80902: variable 'ansible_search_path' from source: unknown 27885 1726882529.80918: calling self._execute() 27885 1726882529.81000: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882529.81014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882529.81043: variable 'omit' from source: magic vars 27885 1726882529.81587: variable 'ansible_distribution' from source: facts 27885 1726882529.81592: Evaluated conditional (ansible_distribution == 'CentOS'): True 27885 1726882529.81650: variable 'ansible_distribution_major_version' from source: facts 27885 1726882529.81660: Evaluated conditional (ansible_distribution_major_version == '6'): False 27885 1726882529.81667: when evaluation is False, skipping this task 27885 1726882529.81672: _execute() done 27885 1726882529.81678: dumping result to json 27885 1726882529.81684: done dumping result, returning 27885 1726882529.81704: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [12673a56-9f93-3fa5-01be-0000000000d5] 27885 1726882529.81713: sending task result for task 12673a56-9f93-3fa5-01be-0000000000d5 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 27885 1726882529.81973: no more pending results, returning what we have 27885 1726882529.81976: results queue empty 27885 1726882529.81977: checking for any_errors_fatal 27885 1726882529.81983: done checking for any_errors_fatal 27885 1726882529.81984: checking for max_fail_percentage 27885 1726882529.81985: done checking for max_fail_percentage 27885 1726882529.81986: checking to see if all hosts have failed and the running result is not ok 27885 1726882529.81987: done checking to see if all hosts have failed 27885 1726882529.81987: getting the remaining hosts for this loop 27885 1726882529.81989: done getting the remaining hosts for this loop 27885 1726882529.81997: getting the next task for host managed_node2 27885 1726882529.82004: done getting next task for host managed_node2 27885 1726882529.82006: ^ task is: TASK: Include the task 'enable_epel.yml' 27885 1726882529.82010: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882529.82014: getting variables 27885 1726882529.82015: in VariableManager get_vars() 27885 1726882529.82043: Calling all_inventory to load vars for managed_node2 27885 1726882529.82046: Calling groups_inventory to load vars for managed_node2 27885 1726882529.82049: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882529.82061: Calling all_plugins_play to load vars for managed_node2 27885 1726882529.82063: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882529.82066: Calling groups_plugins_play to load vars for managed_node2 27885 1726882529.82403: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000d5 27885 1726882529.82407: WORKER PROCESS EXITING 27885 1726882529.82435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882529.82686: done with get_vars() 27885 1726882529.82699: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:35:29 -0400 (0:00:00.026) 0:00:02.470 ****** 27885 1726882529.82788: entering _queue_task() for managed_node2/include_tasks 27885 1726882529.83108: worker is 1 (out of 1 available) 27885 1726882529.83120: exiting _queue_task() for managed_node2/include_tasks 27885 1726882529.83131: done queuing things up, now waiting for results queue to drain 27885 1726882529.83133: waiting for pending results... 27885 1726882529.83379: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 27885 1726882529.83487: in run() - task 12673a56-9f93-3fa5-01be-0000000000d6 27885 1726882529.83517: variable 'ansible_search_path' from source: unknown 27885 1726882529.83600: variable 'ansible_search_path' from source: unknown 27885 1726882529.83604: calling self._execute() 27885 1726882529.83652: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882529.83663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882529.83700: variable 'omit' from source: magic vars 27885 1726882529.84189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882529.86483: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882529.86598: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882529.86601: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882529.86633: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882529.86674: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882529.86761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882529.86803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882529.86835: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882529.86887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882529.86977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882529.87028: variable '__network_is_ostree' from source: set_fact 27885 1726882529.87049: Evaluated conditional (not __network_is_ostree | d(false)): True 27885 1726882529.87059: _execute() done 27885 1726882529.87066: dumping result to json 27885 1726882529.87073: done dumping result, returning 27885 1726882529.87087: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [12673a56-9f93-3fa5-01be-0000000000d6] 27885 1726882529.87101: sending task result for task 12673a56-9f93-3fa5-01be-0000000000d6 27885 1726882529.87223: no more pending results, returning what we have 27885 1726882529.87228: in VariableManager get_vars() 27885 1726882529.87259: Calling all_inventory to load vars for managed_node2 27885 1726882529.87263: Calling groups_inventory to load vars for managed_node2 27885 1726882529.87267: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882529.87278: Calling all_plugins_play to load vars for managed_node2 27885 1726882529.87281: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882529.87284: Calling groups_plugins_play to load vars for managed_node2 27885 1726882529.87718: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000d6 27885 1726882529.87722: WORKER PROCESS EXITING 27885 1726882529.87744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882529.87959: done with get_vars() 27885 1726882529.87967: variable 'ansible_search_path' from source: unknown 27885 1726882529.87968: variable 'ansible_search_path' from source: unknown 27885 1726882529.88006: we have included files to process 27885 1726882529.88007: generating all_blocks data 27885 1726882529.88009: done generating all_blocks data 27885 1726882529.88013: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 27885 1726882529.88015: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 27885 1726882529.88018: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 27885 1726882529.88721: done processing included file 27885 1726882529.88723: iterating over new_blocks loaded from include file 27885 1726882529.88725: in VariableManager get_vars() 27885 1726882529.88736: done with get_vars() 27885 1726882529.88737: filtering new block on tags 27885 1726882529.88758: done filtering new block on tags 27885 1726882529.88761: in VariableManager get_vars() 27885 1726882529.88770: done with get_vars() 27885 1726882529.88772: filtering new block on tags 27885 1726882529.88782: done filtering new block on tags 27885 1726882529.88784: done iterating over new_blocks loaded from include file included: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2 27885 1726882529.88789: extending task lists for all hosts with included blocks 27885 1726882529.88892: done extending task lists 27885 1726882529.88895: done processing included files 27885 1726882529.88896: results queue empty 27885 1726882529.88897: checking for any_errors_fatal 27885 1726882529.88899: done checking for any_errors_fatal 27885 1726882529.88900: checking for max_fail_percentage 27885 1726882529.88901: done checking for max_fail_percentage 27885 1726882529.88902: checking to see if all hosts have failed and the running result is not ok 27885 1726882529.88903: done checking to see if all hosts have failed 27885 1726882529.88903: getting the remaining hosts for this loop 27885 1726882529.88904: done getting the remaining hosts for this loop 27885 1726882529.88906: getting the next task for host managed_node2 27885 1726882529.88914: done getting next task for host managed_node2 27885 1726882529.88917: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 27885 1726882529.88920: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882529.88922: getting variables 27885 1726882529.88923: in VariableManager get_vars() 27885 1726882529.88930: Calling all_inventory to load vars for managed_node2 27885 1726882529.88932: Calling groups_inventory to load vars for managed_node2 27885 1726882529.88934: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882529.88938: Calling all_plugins_play to load vars for managed_node2 27885 1726882529.88946: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882529.88949: Calling groups_plugins_play to load vars for managed_node2 27885 1726882529.89108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882529.89317: done with get_vars() 27885 1726882529.89326: done getting variables 27885 1726882529.89395: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 27885 1726882529.89597: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:35:29 -0400 (0:00:00.068) 0:00:02.538 ****** 27885 1726882529.89642: entering _queue_task() for managed_node2/command 27885 1726882529.89644: Creating lock for command 27885 1726882529.90105: worker is 1 (out of 1 available) 27885 1726882529.90112: exiting _queue_task() for managed_node2/command 27885 1726882529.90122: done queuing things up, now waiting for results queue to drain 27885 1726882529.90123: waiting for pending results... 
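The "Create EPEL 10" task queued above comes from enable_epel.yml:8, pulled in by the include at el_repo_setup.yml:51, which ran because the log shows "Evaluated conditional (not __network_is_ostree | d(false)): True". A hedged sketch of the include and of this first included task; the templated task name and both `when` expressions are taken from the log, while the include path form and the command body are plausible stand-ins, not recorded here:

# In el_repo_setup.yml (sketch; the relative include path is assumed)
- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)

# In enable_epel.yml (sketch; the command body is a stand-in, not taken from this log)
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: >-
    rpm -iv https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{ ansible_distribution_major_version }}.noarch.rpm
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']

The task name rendered as "Create EPEL 10", so the major version is 10; the version check therefore fails and the task is skipped, as the execution below records.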
27885 1726882529.90169: running TaskExecutor() for managed_node2/TASK: Create EPEL 10 27885 1726882529.90277: in run() - task 12673a56-9f93-3fa5-01be-0000000000f0 27885 1726882529.90298: variable 'ansible_search_path' from source: unknown 27885 1726882529.90307: variable 'ansible_search_path' from source: unknown 27885 1726882529.90350: calling self._execute() 27885 1726882529.90427: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882529.90440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882529.90461: variable 'omit' from source: magic vars 27885 1726882529.90829: variable 'ansible_distribution' from source: facts 27885 1726882529.90845: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 27885 1726882529.90974: variable 'ansible_distribution_major_version' from source: facts 27885 1726882529.90985: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 27885 1726882529.90998: when evaluation is False, skipping this task 27885 1726882529.91010: _execute() done 27885 1726882529.91021: dumping result to json 27885 1726882529.91030: done dumping result, returning 27885 1726882529.91042: done running TaskExecutor() for managed_node2/TASK: Create EPEL 10 [12673a56-9f93-3fa5-01be-0000000000f0] 27885 1726882529.91110: sending task result for task 12673a56-9f93-3fa5-01be-0000000000f0 27885 1726882529.91180: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000f0 27885 1726882529.91183: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 27885 1726882529.91270: no more pending results, returning what we have 27885 1726882529.91274: results queue empty 27885 1726882529.91275: checking for any_errors_fatal 27885 1726882529.91276: done checking for any_errors_fatal 27885 1726882529.91277: checking for max_fail_percentage 27885 1726882529.91278: done checking for max_fail_percentage 27885 1726882529.91279: checking to see if all hosts have failed and the running result is not ok 27885 1726882529.91280: done checking to see if all hosts have failed 27885 1726882529.91280: getting the remaining hosts for this loop 27885 1726882529.91284: done getting the remaining hosts for this loop 27885 1726882529.91288: getting the next task for host managed_node2 27885 1726882529.91296: done getting next task for host managed_node2 27885 1726882529.91299: ^ task is: TASK: Install yum-utils package 27885 1726882529.91303: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882529.91306: getting variables 27885 1726882529.91308: in VariableManager get_vars() 27885 1726882529.91342: Calling all_inventory to load vars for managed_node2 27885 1726882529.91344: Calling groups_inventory to load vars for managed_node2 27885 1726882529.91348: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882529.91361: Calling all_plugins_play to load vars for managed_node2 27885 1726882529.91364: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882529.91367: Calling groups_plugins_play to load vars for managed_node2 27885 1726882529.91735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882529.91971: done with get_vars() 27885 1726882529.91985: done getting variables 27885 1726882529.92073: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:35:29 -0400 (0:00:00.024) 0:00:02.563 ****** 27885 1726882529.92106: entering _queue_task() for managed_node2/package 27885 1726882529.92109: Creating lock for package 27885 1726882529.92349: worker is 1 (out of 1 available) 27885 1726882529.92361: exiting _queue_task() for managed_node2/package 27885 1726882529.92375: done queuing things up, now waiting for results queue to drain 27885 1726882529.92376: waiting for pending results... 
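The "Install yum-utils package" task queued above (enable_epel.yml:26, package action) carries the same guard as the rest of the EPEL block; the "Enable EPEL 7" and "Enable EPEL 8" command tasks that follow are skipped for the same reason on this host. A minimal sketch, with the package name inferred from the task title and the `when` conditions taken from the evaluation below:

# Sketch; the package name comes from the task title, the guard from the evaluation below.
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']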
27885 1726882529.92711: running TaskExecutor() for managed_node2/TASK: Install yum-utils package 27885 1726882529.92745: in run() - task 12673a56-9f93-3fa5-01be-0000000000f1 27885 1726882529.92754: variable 'ansible_search_path' from source: unknown 27885 1726882529.92757: variable 'ansible_search_path' from source: unknown 27885 1726882529.92788: calling self._execute() 27885 1726882529.92900: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882529.92903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882529.92906: variable 'omit' from source: magic vars 27885 1726882529.93296: variable 'ansible_distribution' from source: facts 27885 1726882529.93314: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 27885 1726882529.93598: variable 'ansible_distribution_major_version' from source: facts 27885 1726882529.93602: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 27885 1726882529.93604: when evaluation is False, skipping this task 27885 1726882529.93607: _execute() done 27885 1726882529.93609: dumping result to json 27885 1726882529.93611: done dumping result, returning 27885 1726882529.93613: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [12673a56-9f93-3fa5-01be-0000000000f1] 27885 1726882529.93615: sending task result for task 12673a56-9f93-3fa5-01be-0000000000f1 27885 1726882529.93676: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000f1 27885 1726882529.93679: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 27885 1726882529.93729: no more pending results, returning what we have 27885 1726882529.93732: results queue empty 27885 1726882529.93733: checking for any_errors_fatal 27885 1726882529.93738: done checking for any_errors_fatal 27885 1726882529.93739: checking for max_fail_percentage 27885 1726882529.93740: done checking for max_fail_percentage 27885 1726882529.93741: checking to see if all hosts have failed and the running result is not ok 27885 1726882529.93742: done checking to see if all hosts have failed 27885 1726882529.93742: getting the remaining hosts for this loop 27885 1726882529.93744: done getting the remaining hosts for this loop 27885 1726882529.93748: getting the next task for host managed_node2 27885 1726882529.93754: done getting next task for host managed_node2 27885 1726882529.93756: ^ task is: TASK: Enable EPEL 7 27885 1726882529.93761: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882529.93764: getting variables 27885 1726882529.93765: in VariableManager get_vars() 27885 1726882529.93797: Calling all_inventory to load vars for managed_node2 27885 1726882529.93800: Calling groups_inventory to load vars for managed_node2 27885 1726882529.93803: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882529.93816: Calling all_plugins_play to load vars for managed_node2 27885 1726882529.93819: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882529.93822: Calling groups_plugins_play to load vars for managed_node2 27885 1726882529.94113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882529.94498: done with get_vars() 27885 1726882529.94507: done getting variables 27885 1726882529.94558: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:35:29 -0400 (0:00:00.024) 0:00:02.588 ****** 27885 1726882529.94583: entering _queue_task() for managed_node2/command 27885 1726882529.95426: worker is 1 (out of 1 available) 27885 1726882529.95433: exiting _queue_task() for managed_node2/command 27885 1726882529.95444: done queuing things up, now waiting for results queue to drain 27885 1726882529.95445: waiting for pending results... 27885 1726882529.95900: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 27885 1726882529.95904: in run() - task 12673a56-9f93-3fa5-01be-0000000000f2 27885 1726882529.95908: variable 'ansible_search_path' from source: unknown 27885 1726882529.95910: variable 'ansible_search_path' from source: unknown 27885 1726882529.95913: calling self._execute() 27885 1726882529.96064: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882529.96077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882529.96094: variable 'omit' from source: magic vars 27885 1726882529.96700: variable 'ansible_distribution' from source: facts 27885 1726882529.96823: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 27885 1726882529.97140: variable 'ansible_distribution_major_version' from source: facts 27885 1726882529.97151: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 27885 1726882529.97160: when evaluation is False, skipping this task 27885 1726882529.97167: _execute() done 27885 1726882529.97175: dumping result to json 27885 1726882529.97183: done dumping result, returning 27885 1726882529.97199: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [12673a56-9f93-3fa5-01be-0000000000f2] 27885 1726882529.97210: sending task result for task 12673a56-9f93-3fa5-01be-0000000000f2 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 27885 1726882529.97403: no more pending results, returning what we have 27885 1726882529.97406: results queue empty 27885 1726882529.97407: checking for any_errors_fatal 27885 1726882529.97413: done checking 
for any_errors_fatal 27885 1726882529.97414: checking for max_fail_percentage 27885 1726882529.97415: done checking for max_fail_percentage 27885 1726882529.97416: checking to see if all hosts have failed and the running result is not ok 27885 1726882529.97417: done checking to see if all hosts have failed 27885 1726882529.97417: getting the remaining hosts for this loop 27885 1726882529.97419: done getting the remaining hosts for this loop 27885 1726882529.97423: getting the next task for host managed_node2 27885 1726882529.97430: done getting next task for host managed_node2 27885 1726882529.97432: ^ task is: TASK: Enable EPEL 8 27885 1726882529.97437: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882529.97441: getting variables 27885 1726882529.97442: in VariableManager get_vars() 27885 1726882529.97470: Calling all_inventory to load vars for managed_node2 27885 1726882529.97473: Calling groups_inventory to load vars for managed_node2 27885 1726882529.97477: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882529.97489: Calling all_plugins_play to load vars for managed_node2 27885 1726882529.97497: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882529.97501: Calling groups_plugins_play to load vars for managed_node2 27885 1726882529.98514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882529.98902: done with get_vars() 27885 1726882529.98910: done getting variables 27885 1726882529.98937: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000f2 27885 1726882529.98940: WORKER PROCESS EXITING 27885 1726882529.98972: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:35:29 -0400 (0:00:00.044) 0:00:02.632 ****** 27885 1726882529.99004: entering _queue_task() for managed_node2/command 27885 1726882529.99227: worker is 1 (out of 1 available) 27885 1726882529.99239: exiting _queue_task() for managed_node2/command 27885 1726882529.99250: done queuing things up, now waiting for results queue to drain 27885 1726882529.99252: waiting for pending results... 
27885 1726882529.99522: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 27885 1726882529.99637: in run() - task 12673a56-9f93-3fa5-01be-0000000000f3 27885 1726882529.99656: variable 'ansible_search_path' from source: unknown 27885 1726882529.99664: variable 'ansible_search_path' from source: unknown 27885 1726882529.99705: calling self._execute() 27885 1726882529.99779: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882529.99798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882529.99813: variable 'omit' from source: magic vars 27885 1726882530.00204: variable 'ansible_distribution' from source: facts 27885 1726882530.00222: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 27885 1726882530.00358: variable 'ansible_distribution_major_version' from source: facts 27885 1726882530.00374: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 27885 1726882530.00383: when evaluation is False, skipping this task 27885 1726882530.00392: _execute() done 27885 1726882530.00403: dumping result to json 27885 1726882530.00410: done dumping result, returning 27885 1726882530.00421: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [12673a56-9f93-3fa5-01be-0000000000f3] 27885 1726882530.00431: sending task result for task 12673a56-9f93-3fa5-01be-0000000000f3 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 27885 1726882530.00570: no more pending results, returning what we have 27885 1726882530.00574: results queue empty 27885 1726882530.00575: checking for any_errors_fatal 27885 1726882530.00581: done checking for any_errors_fatal 27885 1726882530.00582: checking for max_fail_percentage 27885 1726882530.00583: done checking for max_fail_percentage 27885 1726882530.00584: checking to see if all hosts have failed and the running result is not ok 27885 1726882530.00585: done checking to see if all hosts have failed 27885 1726882530.00586: getting the remaining hosts for this loop 27885 1726882530.00587: done getting the remaining hosts for this loop 27885 1726882530.00595: getting the next task for host managed_node2 27885 1726882530.00604: done getting next task for host managed_node2 27885 1726882530.00607: ^ task is: TASK: Enable EPEL 6 27885 1726882530.00612: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882530.00615: getting variables 27885 1726882530.00617: in VariableManager get_vars() 27885 1726882530.00645: Calling all_inventory to load vars for managed_node2 27885 1726882530.00648: Calling groups_inventory to load vars for managed_node2 27885 1726882530.00652: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882530.00664: Calling all_plugins_play to load vars for managed_node2 27885 1726882530.00667: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882530.00670: Calling groups_plugins_play to load vars for managed_node2 27885 1726882530.01076: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000f3 27885 1726882530.01079: WORKER PROCESS EXITING 27885 1726882530.01106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882530.01312: done with get_vars() 27885 1726882530.01321: done getting variables 27885 1726882530.01375: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:35:30 -0400 (0:00:00.024) 0:00:02.656 ****** 27885 1726882530.01407: entering _queue_task() for managed_node2/copy 27885 1726882530.01625: worker is 1 (out of 1 available) 27885 1726882530.01636: exiting _queue_task() for managed_node2/copy 27885 1726882530.01647: done queuing things up, now waiting for results queue to drain 27885 1726882530.01648: waiting for pending results... 
27885 1726882530.01880: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 27885 1726882530.01982: in run() - task 12673a56-9f93-3fa5-01be-0000000000f5 27885 1726882530.02011: variable 'ansible_search_path' from source: unknown 27885 1726882530.02020: variable 'ansible_search_path' from source: unknown 27885 1726882530.02056: calling self._execute() 27885 1726882530.02135: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882530.02147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882530.02161: variable 'omit' from source: magic vars 27885 1726882530.02582: variable 'ansible_distribution' from source: facts 27885 1726882530.02607: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 27885 1726882530.02801: variable 'ansible_distribution_major_version' from source: facts 27885 1726882530.02814: Evaluated conditional (ansible_distribution_major_version == '6'): False 27885 1726882530.02823: when evaluation is False, skipping this task 27885 1726882530.02831: _execute() done 27885 1726882530.02839: dumping result to json 27885 1726882530.02847: done dumping result, returning 27885 1726882530.02858: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [12673a56-9f93-3fa5-01be-0000000000f5] 27885 1726882530.02872: sending task result for task 12673a56-9f93-3fa5-01be-0000000000f5 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 27885 1726882530.03120: no more pending results, returning what we have 27885 1726882530.03123: results queue empty 27885 1726882530.03125: checking for any_errors_fatal 27885 1726882530.03129: done checking for any_errors_fatal 27885 1726882530.03130: checking for max_fail_percentage 27885 1726882530.03131: done checking for max_fail_percentage 27885 1726882530.03132: checking to see if all hosts have failed and the running result is not ok 27885 1726882530.03133: done checking to see if all hosts have failed 27885 1726882530.03133: getting the remaining hosts for this loop 27885 1726882530.03135: done getting the remaining hosts for this loop 27885 1726882530.03139: getting the next task for host managed_node2 27885 1726882530.03148: done getting next task for host managed_node2 27885 1726882530.03151: ^ task is: TASK: Set network provider to 'nm' 27885 1726882530.03154: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882530.03158: getting variables 27885 1726882530.03160: in VariableManager get_vars() 27885 1726882530.03188: Calling all_inventory to load vars for managed_node2 27885 1726882530.03195: Calling groups_inventory to load vars for managed_node2 27885 1726882530.03199: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882530.03212: Calling all_plugins_play to load vars for managed_node2 27885 1726882530.03215: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882530.03219: Calling groups_plugins_play to load vars for managed_node2 27885 1726882530.03533: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000f5 27885 1726882530.03537: WORKER PROCESS EXITING 27885 1726882530.03558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882530.03761: done with get_vars() 27885 1726882530.03770: done getting variables 27885 1726882530.03827: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:13 Friday 20 September 2024 21:35:30 -0400 (0:00:00.024) 0:00:02.681 ****** 27885 1726882530.03853: entering _queue_task() for managed_node2/set_fact 27885 1726882530.04063: worker is 1 (out of 1 available) 27885 1726882530.04076: exiting _queue_task() for managed_node2/set_fact 27885 1726882530.04302: done queuing things up, now waiting for results queue to drain 27885 1726882530.04304: waiting for pending results... 
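The task queued above, "Set network provider to 'nm'" (tests_route_device_nm.yml:13), is a plain set_fact with no conditional; the ok result in the execution that follows shows the fact it sets. A minimal sketch consistent with that result:

# From tests_route_device_nm.yml (sketch); the fact name and value match the ok result below.
- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm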
27885 1726882530.04350: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 27885 1726882530.04438: in run() - task 12673a56-9f93-3fa5-01be-000000000007 27885 1726882530.04456: variable 'ansible_search_path' from source: unknown 27885 1726882530.04499: calling self._execute() 27885 1726882530.04573: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882530.04586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882530.04605: variable 'omit' from source: magic vars 27885 1726882530.04711: variable 'omit' from source: magic vars 27885 1726882530.04751: variable 'omit' from source: magic vars 27885 1726882530.04795: variable 'omit' from source: magic vars 27885 1726882530.04839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882530.04883: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882530.04912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882530.04934: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882530.04948: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882530.04980: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882530.04987: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882530.04998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882530.05088: Set connection var ansible_pipelining to False 27885 1726882530.05103: Set connection var ansible_connection to ssh 27885 1726882530.05114: Set connection var ansible_timeout to 10 27885 1726882530.05121: Set connection var ansible_shell_type to sh 27885 1726882530.05131: Set connection var ansible_shell_executable to /bin/sh 27885 1726882530.05140: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882530.05166: variable 'ansible_shell_executable' from source: unknown 27885 1726882530.05175: variable 'ansible_connection' from source: unknown 27885 1726882530.05185: variable 'ansible_module_compression' from source: unknown 27885 1726882530.05197: variable 'ansible_shell_type' from source: unknown 27885 1726882530.05205: variable 'ansible_shell_executable' from source: unknown 27885 1726882530.05213: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882530.05221: variable 'ansible_pipelining' from source: unknown 27885 1726882530.05227: variable 'ansible_timeout' from source: unknown 27885 1726882530.05234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882530.05400: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882530.05404: variable 'omit' from source: magic vars 27885 1726882530.05406: starting attempt loop 27885 1726882530.05408: running the handler 27885 1726882530.05411: handler run complete 27885 1726882530.05425: attempt loop complete, returning result 27885 1726882530.05432: _execute() done 27885 1726882530.05439: 
dumping result to json 27885 1726882530.05447: done dumping result, returning 27885 1726882530.05498: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [12673a56-9f93-3fa5-01be-000000000007] 27885 1726882530.05501: sending task result for task 12673a56-9f93-3fa5-01be-000000000007 27885 1726882530.05562: done sending task result for task 12673a56-9f93-3fa5-01be-000000000007 27885 1726882530.05566: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 27885 1726882530.05659: no more pending results, returning what we have 27885 1726882530.05662: results queue empty 27885 1726882530.05663: checking for any_errors_fatal 27885 1726882530.05666: done checking for any_errors_fatal 27885 1726882530.05666: checking for max_fail_percentage 27885 1726882530.05668: done checking for max_fail_percentage 27885 1726882530.05668: checking to see if all hosts have failed and the running result is not ok 27885 1726882530.05669: done checking to see if all hosts have failed 27885 1726882530.05670: getting the remaining hosts for this loop 27885 1726882530.05671: done getting the remaining hosts for this loop 27885 1726882530.05674: getting the next task for host managed_node2 27885 1726882530.05679: done getting next task for host managed_node2 27885 1726882530.05681: ^ task is: TASK: meta (flush_handlers) 27885 1726882530.05683: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882530.05687: getting variables 27885 1726882530.05688: in VariableManager get_vars() 27885 1726882530.05726: Calling all_inventory to load vars for managed_node2 27885 1726882530.05728: Calling groups_inventory to load vars for managed_node2 27885 1726882530.05731: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882530.05742: Calling all_plugins_play to load vars for managed_node2 27885 1726882530.05744: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882530.05747: Calling groups_plugins_play to load vars for managed_node2 27885 1726882530.06061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882530.06285: done with get_vars() 27885 1726882530.06298: done getting variables 27885 1726882530.06354: in VariableManager get_vars() 27885 1726882530.06362: Calling all_inventory to load vars for managed_node2 27885 1726882530.06364: Calling groups_inventory to load vars for managed_node2 27885 1726882530.06366: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882530.06370: Calling all_plugins_play to load vars for managed_node2 27885 1726882530.06372: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882530.06375: Calling groups_plugins_play to load vars for managed_node2 27885 1726882530.06548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882530.06752: done with get_vars() 27885 1726882530.06765: done queuing things up, now waiting for results queue to drain 27885 1726882530.06767: results queue empty 27885 1726882530.06768: checking for any_errors_fatal 27885 1726882530.06770: done checking for any_errors_fatal 27885 1726882530.06770: checking for 
max_fail_percentage 27885 1726882530.06771: done checking for max_fail_percentage 27885 1726882530.06772: checking to see if all hosts have failed and the running result is not ok 27885 1726882530.06772: done checking to see if all hosts have failed 27885 1726882530.06773: getting the remaining hosts for this loop 27885 1726882530.06774: done getting the remaining hosts for this loop 27885 1726882530.06776: getting the next task for host managed_node2 27885 1726882530.06779: done getting next task for host managed_node2 27885 1726882530.06780: ^ task is: TASK: meta (flush_handlers) 27885 1726882530.06782: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882530.06788: getting variables 27885 1726882530.06789: in VariableManager get_vars() 27885 1726882530.06800: Calling all_inventory to load vars for managed_node2 27885 1726882530.06802: Calling groups_inventory to load vars for managed_node2 27885 1726882530.06804: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882530.06808: Calling all_plugins_play to load vars for managed_node2 27885 1726882530.06810: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882530.06813: Calling groups_plugins_play to load vars for managed_node2 27885 1726882530.06959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882530.07161: done with get_vars() 27885 1726882530.07168: done getting variables 27885 1726882530.07213: in VariableManager get_vars() 27885 1726882530.07222: Calling all_inventory to load vars for managed_node2 27885 1726882530.07224: Calling groups_inventory to load vars for managed_node2 27885 1726882530.07226: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882530.07230: Calling all_plugins_play to load vars for managed_node2 27885 1726882530.07232: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882530.07235: Calling groups_plugins_play to load vars for managed_node2 27885 1726882530.07406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882530.07647: done with get_vars() 27885 1726882530.07658: done queuing things up, now waiting for results queue to drain 27885 1726882530.07659: results queue empty 27885 1726882530.07660: checking for any_errors_fatal 27885 1726882530.07661: done checking for any_errors_fatal 27885 1726882530.07662: checking for max_fail_percentage 27885 1726882530.07663: done checking for max_fail_percentage 27885 1726882530.07664: checking to see if all hosts have failed and the running result is not ok 27885 1726882530.07664: done checking to see if all hosts have failed 27885 1726882530.07665: getting the remaining hosts for this loop 27885 1726882530.07666: done getting the remaining hosts for this loop 27885 1726882530.07668: getting the next task for host managed_node2 27885 1726882530.07670: done getting next task for host managed_node2 27885 1726882530.07671: ^ task is: None 27885 1726882530.07672: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 27885 1726882530.07674: done queuing things up, now waiting for results queue to drain 27885 1726882530.07674: results queue empty 27885 1726882530.07675: checking for any_errors_fatal 27885 1726882530.07676: done checking for any_errors_fatal 27885 1726882530.07676: checking for max_fail_percentage 27885 1726882530.07677: done checking for max_fail_percentage 27885 1726882530.07678: checking to see if all hosts have failed and the running result is not ok 27885 1726882530.07678: done checking to see if all hosts have failed 27885 1726882530.07680: getting the next task for host managed_node2 27885 1726882530.07682: done getting next task for host managed_node2 27885 1726882530.07683: ^ task is: None 27885 1726882530.07684: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882530.07730: in VariableManager get_vars() 27885 1726882530.07753: done with get_vars() 27885 1726882530.07758: in VariableManager get_vars() 27885 1726882530.07772: done with get_vars() 27885 1726882530.07776: variable 'omit' from source: magic vars 27885 1726882530.07812: in VariableManager get_vars() 27885 1726882530.07829: done with get_vars() 27885 1726882530.07852: variable 'omit' from source: magic vars PLAY [Test output device of routes] ******************************************** 27885 1726882530.08310: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 27885 1726882530.08334: getting the remaining hosts for this loop 27885 1726882530.08335: done getting the remaining hosts for this loop 27885 1726882530.08337: getting the next task for host managed_node2 27885 1726882530.08340: done getting next task for host managed_node2 27885 1726882530.08341: ^ task is: TASK: Gathering Facts 27885 1726882530.08343: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882530.08344: getting variables 27885 1726882530.08345: in VariableManager get_vars() 27885 1726882530.08357: Calling all_inventory to load vars for managed_node2 27885 1726882530.08359: Calling groups_inventory to load vars for managed_node2 27885 1726882530.08361: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882530.08366: Calling all_plugins_play to load vars for managed_node2 27885 1726882530.08378: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882530.08381: Calling groups_plugins_play to load vars for managed_node2 27885 1726882530.08535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882530.08783: done with get_vars() 27885 1726882530.08795: done getting variables 27885 1726882530.08832: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:3 Friday 20 September 2024 21:35:30 -0400 (0:00:00.049) 0:00:02.731 ****** 27885 1726882530.08854: entering _queue_task() for managed_node2/gather_facts 27885 1726882530.09312: worker is 1 (out of 1 available) 27885 1726882530.09319: exiting _queue_task() for managed_node2/gather_facts 27885 1726882530.09329: done queuing things up, now waiting for results queue to drain 27885 1726882530.09330: waiting for pending results... 
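For orientation: the set_fact result above ("network_provider": "nm", changed: false) and the Gathering Facts task being queued here correspond to playbook content of roughly the following shape. This is a hedged editorial sketch reconstructed from the debug entries, not the actual contents of the test playbooks; the task and play names are taken verbatim from the log.

- hosts: managed_node2        # sketch only; the real play header may differ
  tasks:
    - name: Set network provider to 'nm'
      ansible.builtin.set_fact:
        network_provider: nm  # matches the "ok" result above (changed: false)

- name: Test output device of routes   # second play named in the log; gather_facts is on
  hosts: managed_node2                 # by default, which queues the Gathering Facts task traced below
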
27885 1726882530.09347: running TaskExecutor() for managed_node2/TASK: Gathering Facts 27885 1726882530.09444: in run() - task 12673a56-9f93-3fa5-01be-00000000011b 27885 1726882530.09467: variable 'ansible_search_path' from source: unknown 27885 1726882530.09509: calling self._execute() 27885 1726882530.09587: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882530.09669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882530.09674: variable 'omit' from source: magic vars 27885 1726882530.09978: variable 'ansible_distribution_major_version' from source: facts 27885 1726882530.10003: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882530.10013: variable 'omit' from source: magic vars 27885 1726882530.10039: variable 'omit' from source: magic vars 27885 1726882530.10076: variable 'omit' from source: magic vars 27885 1726882530.10125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882530.10164: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882530.10187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882530.10216: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882530.10232: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882530.10263: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882530.10323: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882530.10326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882530.10384: Set connection var ansible_pipelining to False 27885 1726882530.10399: Set connection var ansible_connection to ssh 27885 1726882530.10409: Set connection var ansible_timeout to 10 27885 1726882530.10416: Set connection var ansible_shell_type to sh 27885 1726882530.10428: Set connection var ansible_shell_executable to /bin/sh 27885 1726882530.10441: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882530.10466: variable 'ansible_shell_executable' from source: unknown 27885 1726882530.10473: variable 'ansible_connection' from source: unknown 27885 1726882530.10480: variable 'ansible_module_compression' from source: unknown 27885 1726882530.10544: variable 'ansible_shell_type' from source: unknown 27885 1726882530.10547: variable 'ansible_shell_executable' from source: unknown 27885 1726882530.10549: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882530.10551: variable 'ansible_pipelining' from source: unknown 27885 1726882530.10553: variable 'ansible_timeout' from source: unknown 27885 1726882530.10555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882530.10694: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882530.10709: variable 'omit' from source: magic vars 27885 1726882530.10718: starting attempt loop 27885 1726882530.10724: running the 
handler 27885 1726882530.10742: variable 'ansible_facts' from source: unknown 27885 1726882530.10769: _low_level_execute_command(): starting 27885 1726882530.10780: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882530.11619: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882530.11625: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882530.11660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882530.11708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882530.11726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882530.11874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882530.13514: stdout chunk (state=3): >>>/root <<< 27885 1726882530.13711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882530.13715: stdout chunk (state=3): >>><<< 27885 1726882530.13718: stderr chunk (state=3): >>><<< 27885 1726882530.14012: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882530.14016: _low_level_execute_command(): starting 27885 1726882530.14020: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882530.1382067-28032-73674693559548 `" && echo 
ansible-tmp-1726882530.1382067-28032-73674693559548="` echo /root/.ansible/tmp/ansible-tmp-1726882530.1382067-28032-73674693559548 `" ) && sleep 0' 27885 1726882530.15300: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882530.15711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882530.15779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882530.15876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882530.17742: stdout chunk (state=3): >>>ansible-tmp-1726882530.1382067-28032-73674693559548=/root/.ansible/tmp/ansible-tmp-1726882530.1382067-28032-73674693559548 <<< 27885 1726882530.17889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882530.17906: stdout chunk (state=3): >>><<< 27885 1726882530.17919: stderr chunk (state=3): >>><<< 27885 1726882530.17940: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882530.1382067-28032-73674693559548=/root/.ansible/tmp/ansible-tmp-1726882530.1382067-28032-73674693559548 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882530.17974: variable 'ansible_module_compression' from source: unknown 27885 1726882530.18144: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 27885 1726882530.18246: variable 'ansible_facts' from source: unknown 
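The mkdir/echo command above and the SFTP transfer that follows are the standard non-pipelined module path: because this connection was set up with ansible_pipelining = False (see the "Set connection var ansible_pipelining to False" entries earlier), the cached AnsiballZ_setup.py payload is written into a per-task remote temp directory, made executable, run with the remote Python, and removed afterwards. A minimal inventory sketch, assuming the same host name as this run, that would enable pipelining and skip those file round-trips:

all:
  hosts:
    managed_node2:
      ansible_pipelining: true   # standard connection variable; when enabled the module
                                 # is fed over the existing SSH channel instead of being
                                 # copied to /root/.ansible/tmp on the target
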
27885 1726882530.18755: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882530.1382067-28032-73674693559548/AnsiballZ_setup.py 27885 1726882530.19198: Sending initial data 27885 1726882530.19244: Sent initial data (153 bytes) 27885 1726882530.20312: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882530.20337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882530.20415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882530.21983: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882530.22040: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27885 1726882530.22104: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp4_c5clv4 /root/.ansible/tmp/ansible-tmp-1726882530.1382067-28032-73674693559548/AnsiballZ_setup.py <<< 27885 1726882530.22216: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882530.1382067-28032-73674693559548/AnsiballZ_setup.py" <<< 27885 1726882530.22248: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp4_c5clv4" to remote "/root/.ansible/tmp/ansible-tmp-1726882530.1382067-28032-73674693559548/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882530.1382067-28032-73674693559548/AnsiballZ_setup.py" <<< 27885 1726882530.24437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882530.24448: stdout chunk (state=3): >>><<< 27885 1726882530.24458: stderr chunk (state=3): >>><<< 27885 1726882530.24480: done transferring module to remote 27885 1726882530.24568: _low_level_execute_command(): starting 27885 1726882530.24571: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882530.1382067-28032-73674693559548/ /root/.ansible/tmp/ansible-tmp-1726882530.1382067-28032-73674693559548/AnsiballZ_setup.py && sleep 0' 27885 1726882530.25101: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882530.25104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882530.25107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27885 1726882530.25109: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882530.25111: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882530.25180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882530.25241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882530.27123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882530.27127: stdout chunk (state=3): >>><<< 27885 1726882530.27130: stderr chunk (state=3): >>><<< 27885 1726882530.27133: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882530.27139: _low_level_execute_command(): starting 27885 1726882530.27142: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882530.1382067-28032-73674693559548/AnsiballZ_setup.py && sleep 0' 27885 1726882530.27747: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882530.27785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882530.27880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882530.27931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882530.27995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882530.90225: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", 
"ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.5537109375, "5m": 0.4970703125, "15m": 0.27490234375}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "30", "epoch": "1726882530", "epoch_int": "1726882530", "date": "2024-09-20", "time": "21:35:30", "iso8601_micro": "2024-09-21T01:35:30.547801Z", "iso8601": "2024-09-21T01:35:30Z", "iso8601_basic": "20240920T213530547801", "iso8601_basic_short": "20240920T213530", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["eth0", "lo", "rpltstbr"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", 
"tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:57:f6:54:9a:30", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", 
"hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2933, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 598, "free": 2933}, "nocache": {"free": 3272, "used": 259}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 720, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794603008, "block_size": 4096, "block_total": 65519099, "block_available": 63914698, "block_used": 1604401, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], 
"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 27885 1726882530.92001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882530.92014: stderr chunk (state=3): >>>Shared connection to 10.31.14.69 closed. <<< 27885 1726882530.92062: stderr chunk (state=3): >>><<< 27885 1726882530.92111: stdout chunk (state=3): >>><<< 27885 1726882530.92173: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.5537109375, "5m": 0.4970703125, "15m": 0.27490234375}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "30", "epoch": "1726882530", "epoch_int": "1726882530", "date": "2024-09-20", "time": "21:35:30", "iso8601_micro": "2024-09-21T01:35:30.547801Z", "iso8601": "2024-09-21T01:35:30Z", "iso8601_basic": "20240920T213530547801", "iso8601_basic_short": "20240920T213530", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["eth0", "lo", "rpltstbr"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", 
"ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off 
[fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:57:f6:54:9a:30", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, 
"ansible_all_ipv4_addresses": ["10.31.14.69", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2933, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 598, "free": 2933}, "nocache": {"free": 3272, "used": 259}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 720, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794603008, "block_size": 4096, "block_total": 65519099, "block_available": 63914698, "block_used": 1604401, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", 
"GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
27885 1726882530.92575: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882530.1382067-28032-73674693559548/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882530.92609: _low_level_execute_command(): starting 27885 1726882530.92619: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882530.1382067-28032-73674693559548/ > /dev/null 2>&1 && sleep 0' 27885 1726882530.93189: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882530.93209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882530.93226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882530.93245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882530.93261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882530.93273: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882530.93372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882530.93383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882530.93480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882530.95298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882530.95350: stderr chunk (state=3): >>><<< 27885 1726882530.95358: stdout chunk (state=3): >>><<< 27885 1726882530.95377: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882530.95388: handler run complete 27885 1726882530.95548: variable 'ansible_facts' from source: unknown 27885 1726882530.95661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882530.96022: variable 'ansible_facts' from source: unknown 27885 1726882530.96123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882530.96264: attempt loop complete, returning result 27885 1726882530.96275: _execute() done 27885 1726882530.96286: dumping result to json 27885 1726882530.96498: done dumping result, returning 27885 1726882530.96501: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [12673a56-9f93-3fa5-01be-00000000011b] 27885 1726882530.96502: sending task result for task 12673a56-9f93-3fa5-01be-00000000011b 27885 1726882530.96875: done sending task result for task 12673a56-9f93-3fa5-01be-00000000011b 27885 1726882530.96879: WORKER PROCESS EXITING ok: [managed_node2] 27885 1726882530.97551: no more pending results, returning what we have 27885 1726882530.97556: results queue empty 27885 1726882530.97557: checking for any_errors_fatal 27885 1726882530.97558: done checking for any_errors_fatal 27885 1726882530.97559: checking for max_fail_percentage 27885 1726882530.97563: done checking for max_fail_percentage 27885 1726882530.97564: checking to see if all hosts have failed and the running result is not ok 27885 1726882530.97564: done checking to see if all hosts have failed 27885 1726882530.97565: getting the remaining hosts for this loop 27885 1726882530.97566: done getting the remaining hosts for this loop 27885 1726882530.97570: getting the next task for host managed_node2 27885 1726882530.97575: done getting next task for host managed_node2 27885 1726882530.97577: ^ task is: TASK: meta (flush_handlers) 27885 1726882530.97579: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882530.97583: getting variables 27885 1726882530.97585: in VariableManager get_vars() 27885 1726882530.97620: Calling all_inventory to load vars for managed_node2 27885 1726882530.97623: Calling groups_inventory to load vars for managed_node2 27885 1726882530.97626: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882530.97639: Calling all_plugins_play to load vars for managed_node2 27885 1726882530.97642: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882530.97647: Calling groups_plugins_play to load vars for managed_node2 27885 1726882530.97902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882530.98567: done with get_vars() 27885 1726882530.98580: done getting variables 27885 1726882530.98653: in VariableManager get_vars() 27885 1726882530.98667: Calling all_inventory to load vars for managed_node2 27885 1726882530.98669: Calling groups_inventory to load vars for managed_node2 27885 1726882530.98671: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882530.98676: Calling all_plugins_play to load vars for managed_node2 27885 1726882530.98678: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882530.98680: Calling groups_plugins_play to load vars for managed_node2 27885 1726882530.99386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882531.00031: done with get_vars() 27885 1726882531.00068: done queuing things up, now waiting for results queue to drain 27885 1726882531.00070: results queue empty 27885 1726882531.00071: checking for any_errors_fatal 27885 1726882531.00074: done checking for any_errors_fatal 27885 1726882531.00075: checking for max_fail_percentage 27885 1726882531.00079: done checking for max_fail_percentage 27885 1726882531.00079: checking to see if all hosts have failed and the running result is not ok 27885 1726882531.00080: done checking to see if all hosts have failed 27885 1726882531.00081: getting the remaining hosts for this loop 27885 1726882531.00081: done getting the remaining hosts for this loop 27885 1726882531.00084: getting the next task for host managed_node2 27885 1726882531.00087: done getting next task for host managed_node2 27885 1726882531.00089: ^ task is: TASK: Set type and interface0 27885 1726882531.00090: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882531.00092: getting variables 27885 1726882531.00094: in VariableManager get_vars() 27885 1726882531.00106: Calling all_inventory to load vars for managed_node2 27885 1726882531.00108: Calling groups_inventory to load vars for managed_node2 27885 1726882531.00110: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882531.00114: Calling all_plugins_play to load vars for managed_node2 27885 1726882531.00116: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882531.00118: Calling groups_plugins_play to load vars for managed_node2 27885 1726882531.00353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882531.00876: done with get_vars() 27885 1726882531.00885: done getting variables 27885 1726882531.00932: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set type and interface0] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:11 Friday 20 September 2024 21:35:31 -0400 (0:00:00.921) 0:00:03.652 ****** 27885 1726882531.00962: entering _queue_task() for managed_node2/set_fact 27885 1726882531.01430: worker is 1 (out of 1 available) 27885 1726882531.01485: exiting _queue_task() for managed_node2/set_fact 27885 1726882531.01501: done queuing things up, now waiting for results queue to drain 27885 1726882531.01502: waiting for pending results... 27885 1726882531.01821: running TaskExecutor() for managed_node2/TASK: Set type and interface0 27885 1726882531.02103: in run() - task 12673a56-9f93-3fa5-01be-00000000000b 27885 1726882531.02124: variable 'ansible_search_path' from source: unknown 27885 1726882531.02164: calling self._execute() 27885 1726882531.02397: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.02420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.02435: variable 'omit' from source: magic vars 27885 1726882531.02980: variable 'ansible_distribution_major_version' from source: facts 27885 1726882531.03001: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882531.03011: variable 'omit' from source: magic vars 27885 1726882531.03040: variable 'omit' from source: magic vars 27885 1726882531.03077: variable 'type' from source: play vars 27885 1726882531.03155: variable 'type' from source: play vars 27885 1726882531.03186: variable 'interface0' from source: play vars 27885 1726882531.03259: variable 'interface0' from source: play vars 27885 1726882531.03283: variable 'omit' from source: magic vars 27885 1726882531.03328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882531.03370: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882531.03401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882531.03424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882531.03444: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882531.03494: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882531.03498: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.03500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.03797: Set connection var ansible_pipelining to False 27885 1726882531.03800: Set connection var ansible_connection to ssh 27885 1726882531.03802: Set connection var ansible_timeout to 10 27885 1726882531.03804: Set connection var ansible_shell_type to sh 27885 1726882531.03805: Set connection var ansible_shell_executable to /bin/sh 27885 1726882531.03807: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882531.03809: variable 'ansible_shell_executable' from source: unknown 27885 1726882531.03810: variable 'ansible_connection' from source: unknown 27885 1726882531.03812: variable 'ansible_module_compression' from source: unknown 27885 1726882531.03813: variable 'ansible_shell_type' from source: unknown 27885 1726882531.03814: variable 'ansible_shell_executable' from source: unknown 27885 1726882531.03816: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.03817: variable 'ansible_pipelining' from source: unknown 27885 1726882531.03819: variable 'ansible_timeout' from source: unknown 27885 1726882531.03821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.03941: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882531.03959: variable 'omit' from source: magic vars 27885 1726882531.03969: starting attempt loop 27885 1726882531.03976: running the handler 27885 1726882531.03990: handler run complete 27885 1726882531.04007: attempt loop complete, returning result 27885 1726882531.04014: _execute() done 27885 1726882531.04022: dumping result to json 27885 1726882531.04029: done dumping result, returning 27885 1726882531.04045: done running TaskExecutor() for managed_node2/TASK: Set type and interface0 [12673a56-9f93-3fa5-01be-00000000000b] 27885 1726882531.04059: sending task result for task 12673a56-9f93-3fa5-01be-00000000000b ok: [managed_node2] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 27885 1726882531.04214: no more pending results, returning what we have 27885 1726882531.04217: results queue empty 27885 1726882531.04218: checking for any_errors_fatal 27885 1726882531.04219: done checking for any_errors_fatal 27885 1726882531.04220: checking for max_fail_percentage 27885 1726882531.04221: done checking for max_fail_percentage 27885 1726882531.04222: checking to see if all hosts have failed and the running result is not ok 27885 1726882531.04223: done checking to see if all hosts have failed 27885 1726882531.04223: getting the remaining hosts for this loop 27885 1726882531.04225: done getting the remaining hosts for this loop 27885 1726882531.04228: getting the next task for host managed_node2 27885 1726882531.04234: done getting next task for host managed_node2 27885 1726882531.04236: ^ task is: TASK: Show interfaces 27885 1726882531.04238: ^ state is: HOST STATE: block=2, task=2, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882531.04241: getting variables 27885 1726882531.04243: in VariableManager get_vars() 27885 1726882531.04284: Calling all_inventory to load vars for managed_node2 27885 1726882531.04287: Calling groups_inventory to load vars for managed_node2 27885 1726882531.04289: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882531.04300: Calling all_plugins_play to load vars for managed_node2 27885 1726882531.04303: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882531.04305: Calling groups_plugins_play to load vars for managed_node2 27885 1726882531.04912: done sending task result for task 12673a56-9f93-3fa5-01be-00000000000b 27885 1726882531.04915: WORKER PROCESS EXITING 27885 1726882531.04962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882531.05765: done with get_vars() 27885 1726882531.05775: done getting variables TASK [Show interfaces] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:15 Friday 20 September 2024 21:35:31 -0400 (0:00:00.050) 0:00:03.702 ****** 27885 1726882531.05974: entering _queue_task() for managed_node2/include_tasks 27885 1726882531.06516: worker is 1 (out of 1 available) 27885 1726882531.06528: exiting _queue_task() for managed_node2/include_tasks 27885 1726882531.06539: done queuing things up, now waiting for results queue to drain 27885 1726882531.06541: waiting for pending results... 
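The 'Set type and interface0' task above is a set_fact action, so it runs entirely on the controller (no module is copied to the remote host); it reads the play variables type and interface0 and republishes them as the facts shown in its ok result (type: veth, interface: ethtest0). A rough reconstruction of what the task at tests_route_device.yml:11 presumably looks like, inferred only from the variable reads and the result in this log (the actual file may differ in detail):

    - name: Set type and interface0
      ansible.builtin.set_fact:
        type: "{{ type }}"
        interface: "{{ interface0 }}"

    # with play-level variables along the lines of:
    vars:
      type: veth
      interface0: ethtest0
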
27885 1726882531.07082: running TaskExecutor() for managed_node2/TASK: Show interfaces 27885 1726882531.07206: in run() - task 12673a56-9f93-3fa5-01be-00000000000c 27885 1726882531.07288: variable 'ansible_search_path' from source: unknown 27885 1726882531.07350: calling self._execute() 27885 1726882531.07487: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.07490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.07501: variable 'omit' from source: magic vars 27885 1726882531.07877: variable 'ansible_distribution_major_version' from source: facts 27885 1726882531.07985: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882531.07989: _execute() done 27885 1726882531.07992: dumping result to json 27885 1726882531.07996: done dumping result, returning 27885 1726882531.07999: done running TaskExecutor() for managed_node2/TASK: Show interfaces [12673a56-9f93-3fa5-01be-00000000000c] 27885 1726882531.08001: sending task result for task 12673a56-9f93-3fa5-01be-00000000000c 27885 1726882531.08075: done sending task result for task 12673a56-9f93-3fa5-01be-00000000000c 27885 1726882531.08078: WORKER PROCESS EXITING 27885 1726882531.08108: no more pending results, returning what we have 27885 1726882531.08114: in VariableManager get_vars() 27885 1726882531.08161: Calling all_inventory to load vars for managed_node2 27885 1726882531.08164: Calling groups_inventory to load vars for managed_node2 27885 1726882531.08167: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882531.08179: Calling all_plugins_play to load vars for managed_node2 27885 1726882531.08183: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882531.08186: Calling groups_plugins_play to load vars for managed_node2 27885 1726882531.08917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882531.09267: done with get_vars() 27885 1726882531.09282: variable 'ansible_search_path' from source: unknown 27885 1726882531.09297: we have included files to process 27885 1726882531.09299: generating all_blocks data 27885 1726882531.09300: done generating all_blocks data 27885 1726882531.09301: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27885 1726882531.09302: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27885 1726882531.09304: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27885 1726882531.09457: in VariableManager get_vars() 27885 1726882531.09479: done with get_vars() 27885 1726882531.09589: done processing included file 27885 1726882531.09591: iterating over new_blocks loaded from include file 27885 1726882531.09594: in VariableManager get_vars() 27885 1726882531.09640: done with get_vars() 27885 1726882531.09642: filtering new block on tags 27885 1726882531.09659: done filtering new block on tags 27885 1726882531.09661: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 27885 1726882531.09666: extending task lists for all hosts with included blocks 27885 1726882531.09811: done extending task lists 
27885 1726882531.09813: done processing included files 27885 1726882531.09814: results queue empty 27885 1726882531.09814: checking for any_errors_fatal 27885 1726882531.09822: done checking for any_errors_fatal 27885 1726882531.09823: checking for max_fail_percentage 27885 1726882531.09824: done checking for max_fail_percentage 27885 1726882531.09825: checking to see if all hosts have failed and the running result is not ok 27885 1726882531.09825: done checking to see if all hosts have failed 27885 1726882531.09826: getting the remaining hosts for this loop 27885 1726882531.09827: done getting the remaining hosts for this loop 27885 1726882531.09830: getting the next task for host managed_node2 27885 1726882531.09834: done getting next task for host managed_node2 27885 1726882531.09836: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 27885 1726882531.09839: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882531.09841: getting variables 27885 1726882531.09842: in VariableManager get_vars() 27885 1726882531.09854: Calling all_inventory to load vars for managed_node2 27885 1726882531.09856: Calling groups_inventory to load vars for managed_node2 27885 1726882531.09858: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882531.09863: Calling all_plugins_play to load vars for managed_node2 27885 1726882531.09865: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882531.09868: Calling groups_plugins_play to load vars for managed_node2 27885 1726882531.10019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882531.10388: done with get_vars() 27885 1726882531.10400: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:35:31 -0400 (0:00:00.045) 0:00:03.747 ****** 27885 1726882531.10538: entering _queue_task() for managed_node2/include_tasks 27885 1726882531.10871: worker is 1 (out of 1 available) 27885 1726882531.10885: exiting _queue_task() for managed_node2/include_tasks 27885 1726882531.10900: done queuing things up, now waiting for results queue to drain 27885 1726882531.10901: waiting for pending results... 
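The two include_tasks steps logged here resolve a short chain of files inside the collection checkout under /tmp/collections-spT: the play task 'Show interfaces' (tests_route_device.yml:15) pulls in tasks/show_interfaces.yml, and line 3 of that file in turn pulls in get_current_interfaces.yml. Sketched only from the paths printed in the log, the chain looks roughly like this (the real files may pass extra parameters or tags):

    # tests_route_device.yml, around line 15
    - name: Show interfaces
      ansible.builtin.include_tasks: tasks/show_interfaces.yml

    # tasks/show_interfaces.yml, around line 3
    - name: Include the task 'get_current_interfaces.yml'
      ansible.builtin.include_tasks: get_current_interfaces.yml
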
27885 1726882531.11438: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 27885 1726882531.11442: in run() - task 12673a56-9f93-3fa5-01be-000000000135 27885 1726882531.11444: variable 'ansible_search_path' from source: unknown 27885 1726882531.11448: variable 'ansible_search_path' from source: unknown 27885 1726882531.11451: calling self._execute() 27885 1726882531.11532: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.11569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.11575: variable 'omit' from source: magic vars 27885 1726882531.12343: variable 'ansible_distribution_major_version' from source: facts 27885 1726882531.12433: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882531.12438: _execute() done 27885 1726882531.12441: dumping result to json 27885 1726882531.12443: done dumping result, returning 27885 1726882531.12505: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-3fa5-01be-000000000135] 27885 1726882531.12509: sending task result for task 12673a56-9f93-3fa5-01be-000000000135 27885 1726882531.12802: no more pending results, returning what we have 27885 1726882531.12807: in VariableManager get_vars() 27885 1726882531.12857: Calling all_inventory to load vars for managed_node2 27885 1726882531.12861: Calling groups_inventory to load vars for managed_node2 27885 1726882531.12863: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882531.12877: Calling all_plugins_play to load vars for managed_node2 27885 1726882531.12880: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882531.12883: Calling groups_plugins_play to load vars for managed_node2 27885 1726882531.13269: done sending task result for task 12673a56-9f93-3fa5-01be-000000000135 27885 1726882531.13272: WORKER PROCESS EXITING 27885 1726882531.13295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882531.13746: done with get_vars() 27885 1726882531.13754: variable 'ansible_search_path' from source: unknown 27885 1726882531.13756: variable 'ansible_search_path' from source: unknown 27885 1726882531.13864: we have included files to process 27885 1726882531.13865: generating all_blocks data 27885 1726882531.13866: done generating all_blocks data 27885 1726882531.13868: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27885 1726882531.13869: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27885 1726882531.13871: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27885 1726882531.14489: done processing included file 27885 1726882531.14491: iterating over new_blocks loaded from include file 27885 1726882531.14494: in VariableManager get_vars() 27885 1726882531.14526: done with get_vars() 27885 1726882531.14528: filtering new block on tags 27885 1726882531.14567: done filtering new block on tags 27885 1726882531.14574: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed_node2 27885 1726882531.14579: extending task lists for all hosts with included blocks 27885 1726882531.14747: done extending task lists 27885 1726882531.14748: done processing included files 27885 1726882531.14749: results queue empty 27885 1726882531.14750: checking for any_errors_fatal 27885 1726882531.14759: done checking for any_errors_fatal 27885 1726882531.14761: checking for max_fail_percentage 27885 1726882531.14762: done checking for max_fail_percentage 27885 1726882531.14770: checking to see if all hosts have failed and the running result is not ok 27885 1726882531.14772: done checking to see if all hosts have failed 27885 1726882531.14772: getting the remaining hosts for this loop 27885 1726882531.14773: done getting the remaining hosts for this loop 27885 1726882531.14776: getting the next task for host managed_node2 27885 1726882531.14797: done getting next task for host managed_node2 27885 1726882531.14801: ^ task is: TASK: Gather current interface info 27885 1726882531.14804: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882531.14806: getting variables 27885 1726882531.14826: in VariableManager get_vars() 27885 1726882531.14852: Calling all_inventory to load vars for managed_node2 27885 1726882531.14855: Calling groups_inventory to load vars for managed_node2 27885 1726882531.14857: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882531.14861: Calling all_plugins_play to load vars for managed_node2 27885 1726882531.14878: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882531.14883: Calling groups_plugins_play to load vars for managed_node2 27885 1726882531.15202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882531.15516: done with get_vars() 27885 1726882531.15526: done getting variables 27885 1726882531.15571: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:35:31 -0400 (0:00:00.050) 0:00:03.798 ****** 27885 1726882531.15615: entering _queue_task() for managed_node2/command 27885 1726882531.15928: worker is 1 (out of 1 available) 27885 1726882531.15940: exiting _queue_task() for managed_node2/command 27885 1726882531.15953: done queuing things up, now waiting for results queue to drain 27885 1726882531.15954: waiting for pending results... 
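The task being queued here, 'Gather current interface info', is a command action; the module arguments shown further down in this log reveal that it simply runs ls -1 with chdir set to /sys/class/net, i.e. it lists the kernel's current network interfaces. A sketch of a task consistent with those arguments (the register variable name is a placeholder, it is not visible in this excerpt):

    - name: Gather current interface info
      ansible.builtin.command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces   # placeholder name
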
27885 1726882531.16219: running TaskExecutor() for managed_node2/TASK: Gather current interface info 27885 1726882531.16444: in run() - task 12673a56-9f93-3fa5-01be-00000000014e 27885 1726882531.16513: variable 'ansible_search_path' from source: unknown 27885 1726882531.16521: variable 'ansible_search_path' from source: unknown 27885 1726882531.16633: calling self._execute() 27885 1726882531.16674: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.16688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.16719: variable 'omit' from source: magic vars 27885 1726882531.17234: variable 'ansible_distribution_major_version' from source: facts 27885 1726882531.17345: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882531.17348: variable 'omit' from source: magic vars 27885 1726882531.17350: variable 'omit' from source: magic vars 27885 1726882531.17384: variable 'omit' from source: magic vars 27885 1726882531.17468: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882531.17536: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882531.17580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882531.17612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882531.17631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882531.17733: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882531.17736: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.17739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.17824: Set connection var ansible_pipelining to False 27885 1726882531.17834: Set connection var ansible_connection to ssh 27885 1726882531.17842: Set connection var ansible_timeout to 10 27885 1726882531.17848: Set connection var ansible_shell_type to sh 27885 1726882531.17856: Set connection var ansible_shell_executable to /bin/sh 27885 1726882531.17864: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882531.17914: variable 'ansible_shell_executable' from source: unknown 27885 1726882531.17945: variable 'ansible_connection' from source: unknown 27885 1726882531.17996: variable 'ansible_module_compression' from source: unknown 27885 1726882531.18000: variable 'ansible_shell_type' from source: unknown 27885 1726882531.18002: variable 'ansible_shell_executable' from source: unknown 27885 1726882531.18004: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.18006: variable 'ansible_pipelining' from source: unknown 27885 1726882531.18008: variable 'ansible_timeout' from source: unknown 27885 1726882531.18010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.18223: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882531.18317: variable 'omit' from source: magic vars 27885 
1726882531.18320: starting attempt loop 27885 1726882531.18323: running the handler 27885 1726882531.18325: _low_level_execute_command(): starting 27885 1726882531.18330: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882531.19262: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882531.19321: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882531.19420: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882531.19500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882531.19555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882531.21182: stdout chunk (state=3): >>>/root <<< 27885 1726882531.21306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882531.21514: stderr chunk (state=3): >>><<< 27885 1726882531.21518: stdout chunk (state=3): >>><<< 27885 1726882531.21522: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882531.21524: _low_level_execute_command(): starting 27885 1726882531.21527: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882531.2137313-28075-90828861792457 `" && echo ansible-tmp-1726882531.2137313-28075-90828861792457="` echo 
/root/.ansible/tmp/ansible-tmp-1726882531.2137313-28075-90828861792457 `" ) && sleep 0' 27885 1726882531.22031: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882531.22042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882531.22056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882531.22075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882531.22102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882531.22115: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882531.22125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882531.22216: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882531.22228: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882531.22254: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882531.22336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882531.24204: stdout chunk (state=3): >>>ansible-tmp-1726882531.2137313-28075-90828861792457=/root/.ansible/tmp/ansible-tmp-1726882531.2137313-28075-90828861792457 <<< 27885 1726882531.24353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882531.24356: stderr chunk (state=3): >>><<< 27885 1726882531.24358: stdout chunk (state=3): >>><<< 27885 1726882531.24399: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882531.2137313-28075-90828861792457=/root/.ansible/tmp/ansible-tmp-1726882531.2137313-28075-90828861792457 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 27885 1726882531.24602: variable 'ansible_module_compression' from source: unknown 27885 1726882531.24605: ANSIBALLZ: Using generic lock for ansible.legacy.command 27885 1726882531.24607: ANSIBALLZ: Acquiring lock 27885 1726882531.24609: ANSIBALLZ: Lock acquired: 140560087758944 27885 1726882531.24611: ANSIBALLZ: Creating module 27885 1726882531.37366: ANSIBALLZ: Writing module into payload 27885 1726882531.37431: ANSIBALLZ: Writing module 27885 1726882531.37448: ANSIBALLZ: Renaming module 27885 1726882531.37453: ANSIBALLZ: Done creating module 27885 1726882531.37468: variable 'ansible_facts' from source: unknown 27885 1726882531.37520: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882531.2137313-28075-90828861792457/AnsiballZ_command.py 27885 1726882531.37629: Sending initial data 27885 1726882531.37633: Sent initial data (155 bytes) 27885 1726882531.38059: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882531.38087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882531.38097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882531.38099: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882531.38101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882531.38144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882531.38147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882531.38162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882531.38231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882531.39792: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882531.39858: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27885 1726882531.39919: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp37rn7nnh /root/.ansible/tmp/ansible-tmp-1726882531.2137313-28075-90828861792457/AnsiballZ_command.py <<< 27885 1726882531.39926: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882531.2137313-28075-90828861792457/AnsiballZ_command.py" <<< 27885 1726882531.39978: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp37rn7nnh" to remote "/root/.ansible/tmp/ansible-tmp-1726882531.2137313-28075-90828861792457/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882531.2137313-28075-90828861792457/AnsiballZ_command.py" <<< 27885 1726882531.40595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882531.40646: stderr chunk (state=3): >>><<< 27885 1726882531.40649: stdout chunk (state=3): >>><<< 27885 1726882531.40652: done transferring module to remote 27885 1726882531.40656: _low_level_execute_command(): starting 27885 1726882531.40662: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882531.2137313-28075-90828861792457/ /root/.ansible/tmp/ansible-tmp-1726882531.2137313-28075-90828861792457/AnsiballZ_command.py && sleep 0' 27885 1726882531.41371: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882531.41391: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882531.41489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882531.43219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882531.43242: stderr chunk (state=3): >>><<< 27885 1726882531.43248: stdout chunk (state=3): >>><<< 27885 1726882531.43262: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882531.43267: _low_level_execute_command(): starting 27885 1726882531.43273: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882531.2137313-28075-90828861792457/AnsiballZ_command.py && sleep 0' 27885 1726882531.43721: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882531.43751: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882531.43796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882531.43815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882531.43917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882531.59394: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:35:31.589502", "end": "2024-09-20 21:35:31.592967", "delta": "0:00:00.003465", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27885 1726882531.60907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882531.60936: stderr chunk (state=3): >>><<< 27885 1726882531.60939: stdout chunk (state=3): >>><<< 27885 1726882531.60960: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:35:31.589502", "end": "2024-09-20 21:35:31.592967", "delta": "0:00:00.003465", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
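The command module came back with rc=0 and stdout "bonding_masters\neth0\nlo\nrpltstbr". When such a result is registered, Ansible also exposes the newline-split form as stdout_lines, which is what a follow-up set_fact normally consumes. Using the same placeholder variable name as above, the registered value would look roughly like:

    _current_interfaces:
      rc: 0
      stdout: "bonding_masters\neth0\nlo\nrpltstbr"
      stdout_lines:
        - bonding_masters
        - eth0
        - lo
        - rpltstbr
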
27885 1726882531.60988: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882531.2137313-28075-90828861792457/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882531.60996: _low_level_execute_command(): starting 27885 1726882531.61002: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882531.2137313-28075-90828861792457/ > /dev/null 2>&1 && sleep 0' 27885 1726882531.61531: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882531.61534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882531.61537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882531.61539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882531.61541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882531.61588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882531.61591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882531.61598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882531.61659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882531.63565: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882531.63569: stderr chunk (state=3): >>><<< 27885 1726882531.63577: stdout chunk (state=3): >>><<< 27885 1726882531.63602: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882531.63609: handler run complete 27885 1726882531.63758: Evaluated conditional (False): False 27885 1726882531.63761: attempt loop complete, returning result 27885 1726882531.63763: _execute() done 27885 1726882531.63766: dumping result to json 27885 1726882531.63768: done dumping result, returning 27885 1726882531.63770: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [12673a56-9f93-3fa5-01be-00000000014e] 27885 1726882531.63771: sending task result for task 12673a56-9f93-3fa5-01be-00000000014e 27885 1726882531.63840: done sending task result for task 12673a56-9f93-3fa5-01be-00000000014e ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003465", "end": "2024-09-20 21:35:31.592967", "rc": 0, "start": "2024-09-20 21:35:31.589502" } STDOUT: bonding_masters eth0 lo rpltstbr 27885 1726882531.63916: no more pending results, returning what we have 27885 1726882531.63919: results queue empty 27885 1726882531.63920: checking for any_errors_fatal 27885 1726882531.63921: done checking for any_errors_fatal 27885 1726882531.63922: checking for max_fail_percentage 27885 1726882531.63923: done checking for max_fail_percentage 27885 1726882531.63924: checking to see if all hosts have failed and the running result is not ok 27885 1726882531.63924: done checking to see if all hosts have failed 27885 1726882531.63925: getting the remaining hosts for this loop 27885 1726882531.63927: done getting the remaining hosts for this loop 27885 1726882531.63930: getting the next task for host managed_node2 27885 1726882531.63936: done getting next task for host managed_node2 27885 1726882531.63938: ^ task is: TASK: Set current_interfaces 27885 1726882531.63942: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882531.63945: getting variables 27885 1726882531.63946: in VariableManager get_vars() 27885 1726882531.63986: Calling all_inventory to load vars for managed_node2 27885 1726882531.63988: Calling groups_inventory to load vars for managed_node2 27885 1726882531.63995: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882531.64006: Calling all_plugins_play to load vars for managed_node2 27885 1726882531.64009: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882531.64011: Calling groups_plugins_play to load vars for managed_node2 27885 1726882531.64194: WORKER PROCESS EXITING 27885 1726882531.64325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882531.64574: done with get_vars() 27885 1726882531.64584: done getting variables 27885 1726882531.64640: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:35:31 -0400 (0:00:00.490) 0:00:04.289 ****** 27885 1726882531.64666: entering _queue_task() for managed_node2/set_fact 27885 1726882531.64930: worker is 1 (out of 1 available) 27885 1726882531.64943: exiting _queue_task() for managed_node2/set_fact 27885 1726882531.64955: done queuing things up, now waiting for results queue to drain 27885 1726882531.64956: waiting for pending results... 
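The task being queued here, Set current_interfaces at get_current_interfaces.yml:9, is a pure set_fact; the ansible_facts reported a few entries below show it turning the registered command output into a list. A plausible sketch, with the exact expression as an assumption:

# Hypothetical sketch of "Set current_interfaces" (tasks/get_current_interfaces.yml:9);
# the log confirms the action and the resulting list, not the expression used to build it.
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"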
27885 1726882531.65245: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 27885 1726882531.65279: in run() - task 12673a56-9f93-3fa5-01be-00000000014f 27885 1726882531.65297: variable 'ansible_search_path' from source: unknown 27885 1726882531.65301: variable 'ansible_search_path' from source: unknown 27885 1726882531.65343: calling self._execute() 27885 1726882531.65424: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.65430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.65438: variable 'omit' from source: magic vars 27885 1726882531.66249: variable 'ansible_distribution_major_version' from source: facts 27885 1726882531.66260: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882531.66266: variable 'omit' from source: magic vars 27885 1726882531.66314: variable 'omit' from source: magic vars 27885 1726882531.66717: variable '_current_interfaces' from source: set_fact 27885 1726882531.66819: variable 'omit' from source: magic vars 27885 1726882531.66860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882531.66913: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882531.66978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882531.66987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882531.66994: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882531.67031: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882531.67041: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.67087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.67177: Set connection var ansible_pipelining to False 27885 1726882531.67198: Set connection var ansible_connection to ssh 27885 1726882531.67216: Set connection var ansible_timeout to 10 27885 1726882531.67226: Set connection var ansible_shell_type to sh 27885 1726882531.67306: Set connection var ansible_shell_executable to /bin/sh 27885 1726882531.67312: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882531.67318: variable 'ansible_shell_executable' from source: unknown 27885 1726882531.67321: variable 'ansible_connection' from source: unknown 27885 1726882531.67324: variable 'ansible_module_compression' from source: unknown 27885 1726882531.67326: variable 'ansible_shell_type' from source: unknown 27885 1726882531.67328: variable 'ansible_shell_executable' from source: unknown 27885 1726882531.67330: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.67332: variable 'ansible_pipelining' from source: unknown 27885 1726882531.67334: variable 'ansible_timeout' from source: unknown 27885 1726882531.67336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.67476: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 27885 1726882531.67492: variable 'omit' from source: magic vars 27885 1726882531.67506: starting attempt loop 27885 1726882531.67514: running the handler 27885 1726882531.67542: handler run complete 27885 1726882531.67634: attempt loop complete, returning result 27885 1726882531.67641: _execute() done 27885 1726882531.67647: dumping result to json 27885 1726882531.67649: done dumping result, returning 27885 1726882531.67652: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [12673a56-9f93-3fa5-01be-00000000014f] 27885 1726882531.67654: sending task result for task 12673a56-9f93-3fa5-01be-00000000014f 27885 1726882531.67719: done sending task result for task 12673a56-9f93-3fa5-01be-00000000014f 27885 1726882531.67722: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 27885 1726882531.67813: no more pending results, returning what we have 27885 1726882531.67816: results queue empty 27885 1726882531.67817: checking for any_errors_fatal 27885 1726882531.67825: done checking for any_errors_fatal 27885 1726882531.67825: checking for max_fail_percentage 27885 1726882531.67827: done checking for max_fail_percentage 27885 1726882531.67828: checking to see if all hosts have failed and the running result is not ok 27885 1726882531.67829: done checking to see if all hosts have failed 27885 1726882531.67829: getting the remaining hosts for this loop 27885 1726882531.67831: done getting the remaining hosts for this loop 27885 1726882531.67835: getting the next task for host managed_node2 27885 1726882531.67848: done getting next task for host managed_node2 27885 1726882531.67852: ^ task is: TASK: Show current_interfaces 27885 1726882531.67856: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882531.67860: getting variables 27885 1726882531.67862: in VariableManager get_vars() 27885 1726882531.67904: Calling all_inventory to load vars for managed_node2 27885 1726882531.67907: Calling groups_inventory to load vars for managed_node2 27885 1726882531.67910: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882531.67921: Calling all_plugins_play to load vars for managed_node2 27885 1726882531.67924: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882531.67928: Calling groups_plugins_play to load vars for managed_node2 27885 1726882531.68349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882531.68869: done with get_vars() 27885 1726882531.68879: done getting variables 27885 1726882531.69197: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:35:31 -0400 (0:00:00.045) 0:00:04.334 ****** 27885 1726882531.69226: entering _queue_task() for managed_node2/debug 27885 1726882531.69228: Creating lock for debug 27885 1726882531.69749: worker is 1 (out of 1 available) 27885 1726882531.69762: exiting _queue_task() for managed_node2/debug 27885 1726882531.69773: done queuing things up, now waiting for results queue to drain 27885 1726882531.69774: waiting for pending results... 
27885 1726882531.70097: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 27885 1726882531.70118: in run() - task 12673a56-9f93-3fa5-01be-000000000136 27885 1726882531.70127: variable 'ansible_search_path' from source: unknown 27885 1726882531.70131: variable 'ansible_search_path' from source: unknown 27885 1726882531.70256: calling self._execute() 27885 1726882531.70601: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.70606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.70609: variable 'omit' from source: magic vars 27885 1726882531.70926: variable 'ansible_distribution_major_version' from source: facts 27885 1726882531.70933: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882531.70939: variable 'omit' from source: magic vars 27885 1726882531.70974: variable 'omit' from source: magic vars 27885 1726882531.71069: variable 'current_interfaces' from source: set_fact 27885 1726882531.71102: variable 'omit' from source: magic vars 27885 1726882531.71139: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882531.71323: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882531.71326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882531.71329: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882531.71331: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882531.71334: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882531.71337: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.71339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.71353: Set connection var ansible_pipelining to False 27885 1726882531.71363: Set connection var ansible_connection to ssh 27885 1726882531.71369: Set connection var ansible_timeout to 10 27885 1726882531.71371: Set connection var ansible_shell_type to sh 27885 1726882531.71373: Set connection var ansible_shell_executable to /bin/sh 27885 1726882531.71376: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882531.71404: variable 'ansible_shell_executable' from source: unknown 27885 1726882531.71408: variable 'ansible_connection' from source: unknown 27885 1726882531.71410: variable 'ansible_module_compression' from source: unknown 27885 1726882531.71413: variable 'ansible_shell_type' from source: unknown 27885 1726882531.71415: variable 'ansible_shell_executable' from source: unknown 27885 1726882531.71417: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.71419: variable 'ansible_pipelining' from source: unknown 27885 1726882531.71423: variable 'ansible_timeout' from source: unknown 27885 1726882531.71427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.71557: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
27885 1726882531.71567: variable 'omit' from source: magic vars 27885 1726882531.71583: starting attempt loop 27885 1726882531.71586: running the handler 27885 1726882531.71694: handler run complete 27885 1726882531.71697: attempt loop complete, returning result 27885 1726882531.71699: _execute() done 27885 1726882531.71700: dumping result to json 27885 1726882531.71702: done dumping result, returning 27885 1726882531.71705: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [12673a56-9f93-3fa5-01be-000000000136] 27885 1726882531.71706: sending task result for task 12673a56-9f93-3fa5-01be-000000000136 27885 1726882531.71759: done sending task result for task 12673a56-9f93-3fa5-01be-000000000136 27885 1726882531.71761: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 27885 1726882531.71832: no more pending results, returning what we have 27885 1726882531.71835: results queue empty 27885 1726882531.71836: checking for any_errors_fatal 27885 1726882531.71840: done checking for any_errors_fatal 27885 1726882531.71841: checking for max_fail_percentage 27885 1726882531.71842: done checking for max_fail_percentage 27885 1726882531.71843: checking to see if all hosts have failed and the running result is not ok 27885 1726882531.71843: done checking to see if all hosts have failed 27885 1726882531.71844: getting the remaining hosts for this loop 27885 1726882531.71845: done getting the remaining hosts for this loop 27885 1726882531.71848: getting the next task for host managed_node2 27885 1726882531.71853: done getting next task for host managed_node2 27885 1726882531.71855: ^ task is: TASK: Manage test interface 27885 1726882531.71857: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882531.71860: getting variables 27885 1726882531.71861: in VariableManager get_vars() 27885 1726882531.71891: Calling all_inventory to load vars for managed_node2 27885 1726882531.71895: Calling groups_inventory to load vars for managed_node2 27885 1726882531.71898: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882531.71907: Calling all_plugins_play to load vars for managed_node2 27885 1726882531.71909: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882531.71912: Calling groups_plugins_play to load vars for managed_node2 27885 1726882531.72326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882531.72542: done with get_vars() 27885 1726882531.72550: done getting variables TASK [Manage test interface] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:17 Friday 20 September 2024 21:35:31 -0400 (0:00:00.034) 0:00:04.368 ****** 27885 1726882531.72634: entering _queue_task() for managed_node2/include_tasks 27885 1726882531.72847: worker is 1 (out of 1 available) 27885 1726882531.72858: exiting _queue_task() for managed_node2/include_tasks 27885 1726882531.72985: done queuing things up, now waiting for results queue to drain 27885 1726882531.72987: waiting for pending results... 
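The MSG printed above comes from the debug task at show_interfaces.yml:5. A sketch consistent with that output, with the wording of the msg template as an assumption:

# Hypothetical sketch of "Show current_interfaces" (tasks/show_interfaces.yml:5).
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"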
27885 1726882531.73214: running TaskExecutor() for managed_node2/TASK: Manage test interface 27885 1726882531.73219: in run() - task 12673a56-9f93-3fa5-01be-00000000000d 27885 1726882531.73233: variable 'ansible_search_path' from source: unknown 27885 1726882531.73274: calling self._execute() 27885 1726882531.73376: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.73390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.73407: variable 'omit' from source: magic vars 27885 1726882531.74021: variable 'ansible_distribution_major_version' from source: facts 27885 1726882531.74025: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882531.74027: _execute() done 27885 1726882531.74030: dumping result to json 27885 1726882531.74032: done dumping result, returning 27885 1726882531.74035: done running TaskExecutor() for managed_node2/TASK: Manage test interface [12673a56-9f93-3fa5-01be-00000000000d] 27885 1726882531.74037: sending task result for task 12673a56-9f93-3fa5-01be-00000000000d 27885 1726882531.74200: no more pending results, returning what we have 27885 1726882531.74205: in VariableManager get_vars() 27885 1726882531.74246: Calling all_inventory to load vars for managed_node2 27885 1726882531.74249: Calling groups_inventory to load vars for managed_node2 27885 1726882531.74251: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882531.74263: Calling all_plugins_play to load vars for managed_node2 27885 1726882531.74266: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882531.74269: Calling groups_plugins_play to load vars for managed_node2 27885 1726882531.74586: done sending task result for task 12673a56-9f93-3fa5-01be-00000000000d 27885 1726882531.74589: WORKER PROCESS EXITING 27885 1726882531.74617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882531.74834: done with get_vars() 27885 1726882531.74842: variable 'ansible_search_path' from source: unknown 27885 1726882531.74853: we have included files to process 27885 1726882531.74854: generating all_blocks data 27885 1726882531.74856: done generating all_blocks data 27885 1726882531.74860: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 27885 1726882531.74861: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 27885 1726882531.74863: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 27885 1726882531.75404: in VariableManager get_vars() 27885 1726882531.75425: done with get_vars() 27885 1726882531.75640: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 27885 1726882531.76212: done processing included file 27885 1726882531.76214: iterating over new_blocks loaded from include file 27885 1726882531.76215: in VariableManager get_vars() 27885 1726882531.76232: done with get_vars() 27885 1726882531.76234: filtering new block on tags 27885 1726882531.76270: done filtering new block on tags 27885 1726882531.76273: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node2 
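The include just resolved, tests_route_device.yml:17 pulling in tasks/manage_test_interface.yml, would look roughly as follows; the vars block is hypothetical, since this part of the log only shows that 'state' later arrives as an include param and 'type' via set_fact:

# Hypothetical sketch of "Manage test interface" (tests_route_device.yml:17).
- name: Manage test interface
  include_tasks: tasks/manage_test_interface.yml
  vars:
    state: present   # assumption; the actual value is not visible in this part of the log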
27885 1726882531.76289: extending task lists for all hosts with included blocks 27885 1726882531.76590: done extending task lists 27885 1726882531.76592: done processing included files 27885 1726882531.76594: results queue empty 27885 1726882531.76595: checking for any_errors_fatal 27885 1726882531.76597: done checking for any_errors_fatal 27885 1726882531.76598: checking for max_fail_percentage 27885 1726882531.76600: done checking for max_fail_percentage 27885 1726882531.76600: checking to see if all hosts have failed and the running result is not ok 27885 1726882531.76601: done checking to see if all hosts have failed 27885 1726882531.76602: getting the remaining hosts for this loop 27885 1726882531.76603: done getting the remaining hosts for this loop 27885 1726882531.76606: getting the next task for host managed_node2 27885 1726882531.76609: done getting next task for host managed_node2 27885 1726882531.76612: ^ task is: TASK: Ensure state in ["present", "absent"] 27885 1726882531.76614: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882531.76616: getting variables 27885 1726882531.76617: in VariableManager get_vars() 27885 1726882531.76630: Calling all_inventory to load vars for managed_node2 27885 1726882531.76632: Calling groups_inventory to load vars for managed_node2 27885 1726882531.76634: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882531.76640: Calling all_plugins_play to load vars for managed_node2 27885 1726882531.76642: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882531.76645: Calling groups_plugins_play to load vars for managed_node2 27885 1726882531.76808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882531.77029: done with get_vars() 27885 1726882531.77038: done getting variables 27885 1726882531.77098: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:35:31 -0400 (0:00:00.044) 0:00:04.413 ****** 27885 1726882531.77129: entering _queue_task() for managed_node2/fail 27885 1726882531.77131: Creating lock for fail 27885 1726882531.77599: worker is 1 (out of 1 available) 27885 1726882531.77606: exiting _queue_task() for managed_node2/fail 27885 1726882531.77618: done queuing things up, now waiting for results queue to drain 27885 1726882531.77619: waiting for pending results... 
27885 1726882531.77638: running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] 27885 1726882531.77740: in run() - task 12673a56-9f93-3fa5-01be-00000000016a 27885 1726882531.77765: variable 'ansible_search_path' from source: unknown 27885 1726882531.77774: variable 'ansible_search_path' from source: unknown 27885 1726882531.77815: calling self._execute() 27885 1726882531.77906: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.77918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.77934: variable 'omit' from source: magic vars 27885 1726882531.78285: variable 'ansible_distribution_major_version' from source: facts 27885 1726882531.78306: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882531.78441: variable 'state' from source: include params 27885 1726882531.78451: Evaluated conditional (state not in ["present", "absent"]): False 27885 1726882531.78457: when evaluation is False, skipping this task 27885 1726882531.78463: _execute() done 27885 1726882531.78469: dumping result to json 27885 1726882531.78475: done dumping result, returning 27885 1726882531.78484: done running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] [12673a56-9f93-3fa5-01be-00000000016a] 27885 1726882531.78496: sending task result for task 12673a56-9f93-3fa5-01be-00000000016a skipping: [managed_node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 27885 1726882531.78641: no more pending results, returning what we have 27885 1726882531.78644: results queue empty 27885 1726882531.78645: checking for any_errors_fatal 27885 1726882531.78646: done checking for any_errors_fatal 27885 1726882531.78647: checking for max_fail_percentage 27885 1726882531.78648: done checking for max_fail_percentage 27885 1726882531.78649: checking to see if all hosts have failed and the running result is not ok 27885 1726882531.78649: done checking to see if all hosts have failed 27885 1726882531.78650: getting the remaining hosts for this loop 27885 1726882531.78651: done getting the remaining hosts for this loop 27885 1726882531.78654: getting the next task for host managed_node2 27885 1726882531.78660: done getting next task for host managed_node2 27885 1726882531.78662: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 27885 1726882531.78666: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882531.78670: getting variables 27885 1726882531.78671: in VariableManager get_vars() 27885 1726882531.78707: Calling all_inventory to load vars for managed_node2 27885 1726882531.78710: Calling groups_inventory to load vars for managed_node2 27885 1726882531.78712: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882531.78839: Calling all_plugins_play to load vars for managed_node2 27885 1726882531.78843: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882531.78848: done sending task result for task 12673a56-9f93-3fa5-01be-00000000016a 27885 1726882531.78851: WORKER PROCESS EXITING 27885 1726882531.78855: Calling groups_plugins_play to load vars for managed_node2 27885 1726882531.79148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882531.79404: done with get_vars() 27885 1726882531.79413: done getting variables 27885 1726882531.79462: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:35:31 -0400 (0:00:00.023) 0:00:04.437 ****** 27885 1726882531.79504: entering _queue_task() for managed_node2/fail 27885 1726882531.79912: worker is 1 (out of 1 available) 27885 1726882531.79918: exiting _queue_task() for managed_node2/fail 27885 1726882531.79927: done queuing things up, now waiting for results queue to drain 27885 1726882531.79928: waiting for pending results... 
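Both guard tasks in manage_test_interface.yml, the state check skipped just above and the type check that follows, are fail tasks gated by the conditions echoed verbatim in the skip results. A plausible sketch, with the failure messages as assumptions:

# Hypothetical sketch of the two guards at manage_test_interface.yml:3 and :8;
# the when: conditions are copied from the false_condition fields in the log.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be 'present' or 'absent'"   # message text is an assumption
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be 'dummy', 'tap' or 'veth'"   # message text is an assumption
  when: type not in ["dummy", "tap", "veth"]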
27885 1726882531.79975: running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] 27885 1726882531.80080: in run() - task 12673a56-9f93-3fa5-01be-00000000016b 27885 1726882531.80103: variable 'ansible_search_path' from source: unknown 27885 1726882531.80111: variable 'ansible_search_path' from source: unknown 27885 1726882531.80151: calling self._execute() 27885 1726882531.80230: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.80243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.80264: variable 'omit' from source: magic vars 27885 1726882531.80613: variable 'ansible_distribution_major_version' from source: facts 27885 1726882531.80630: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882531.80774: variable 'type' from source: set_fact 27885 1726882531.80785: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 27885 1726882531.80792: when evaluation is False, skipping this task 27885 1726882531.80809: _execute() done 27885 1726882531.80908: dumping result to json 27885 1726882531.80915: done dumping result, returning 27885 1726882531.80918: done running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] [12673a56-9f93-3fa5-01be-00000000016b] 27885 1726882531.80920: sending task result for task 12673a56-9f93-3fa5-01be-00000000016b 27885 1726882531.80975: done sending task result for task 12673a56-9f93-3fa5-01be-00000000016b 27885 1726882531.80977: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 27885 1726882531.81062: no more pending results, returning what we have 27885 1726882531.81065: results queue empty 27885 1726882531.81067: checking for any_errors_fatal 27885 1726882531.81073: done checking for any_errors_fatal 27885 1726882531.81074: checking for max_fail_percentage 27885 1726882531.81075: done checking for max_fail_percentage 27885 1726882531.81076: checking to see if all hosts have failed and the running result is not ok 27885 1726882531.81077: done checking to see if all hosts have failed 27885 1726882531.81078: getting the remaining hosts for this loop 27885 1726882531.81079: done getting the remaining hosts for this loop 27885 1726882531.81082: getting the next task for host managed_node2 27885 1726882531.81088: done getting next task for host managed_node2 27885 1726882531.81090: ^ task is: TASK: Include the task 'show_interfaces.yml' 27885 1726882531.81095: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882531.81099: getting variables 27885 1726882531.81101: in VariableManager get_vars() 27885 1726882531.81141: Calling all_inventory to load vars for managed_node2 27885 1726882531.81144: Calling groups_inventory to load vars for managed_node2 27885 1726882531.81146: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882531.81156: Calling all_plugins_play to load vars for managed_node2 27885 1726882531.81159: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882531.81162: Calling groups_plugins_play to load vars for managed_node2 27885 1726882531.81390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882531.81556: done with get_vars() 27885 1726882531.81563: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:35:31 -0400 (0:00:00.021) 0:00:04.458 ****** 27885 1726882531.81622: entering _queue_task() for managed_node2/include_tasks 27885 1726882531.81774: worker is 1 (out of 1 available) 27885 1726882531.81786: exiting _queue_task() for managed_node2/include_tasks 27885 1726882531.81797: done queuing things up, now waiting for results queue to drain 27885 1726882531.81798: waiting for pending results... 27885 1726882531.81941: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 27885 1726882531.82006: in run() - task 12673a56-9f93-3fa5-01be-00000000016c 27885 1726882531.82020: variable 'ansible_search_path' from source: unknown 27885 1726882531.82024: variable 'ansible_search_path' from source: unknown 27885 1726882531.82051: calling self._execute() 27885 1726882531.82113: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.82116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.82125: variable 'omit' from source: magic vars 27885 1726882531.82376: variable 'ansible_distribution_major_version' from source: facts 27885 1726882531.82386: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882531.82391: _execute() done 27885 1726882531.82399: dumping result to json 27885 1726882531.82402: done dumping result, returning 27885 1726882531.82407: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-3fa5-01be-00000000016c] 27885 1726882531.82413: sending task result for task 12673a56-9f93-3fa5-01be-00000000016c 27885 1726882531.82489: done sending task result for task 12673a56-9f93-3fa5-01be-00000000016c 27885 1726882531.82492: WORKER PROCESS EXITING 27885 1726882531.82518: no more pending results, returning what we have 27885 1726882531.82522: in VariableManager get_vars() 27885 1726882531.82558: Calling all_inventory to load vars for managed_node2 27885 1726882531.82560: Calling groups_inventory to load vars for managed_node2 27885 1726882531.82562: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882531.82570: Calling all_plugins_play to load vars for managed_node2 27885 1726882531.82572: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882531.82574: Calling groups_plugins_play to load vars for managed_node2 27885 1726882531.82700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 27885 1726882531.82825: done with get_vars() 27885 1726882531.82831: variable 'ansible_search_path' from source: unknown 27885 1726882531.82831: variable 'ansible_search_path' from source: unknown 27885 1726882531.82853: we have included files to process 27885 1726882531.82853: generating all_blocks data 27885 1726882531.82854: done generating all_blocks data 27885 1726882531.82857: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27885 1726882531.82858: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27885 1726882531.82859: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27885 1726882531.82921: in VariableManager get_vars() 27885 1726882531.82937: done with get_vars() 27885 1726882531.83027: done processing included file 27885 1726882531.83029: iterating over new_blocks loaded from include file 27885 1726882531.83031: in VariableManager get_vars() 27885 1726882531.83049: done with get_vars() 27885 1726882531.83052: filtering new block on tags 27885 1726882531.83068: done filtering new block on tags 27885 1726882531.83070: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 27885 1726882531.83074: extending task lists for all hosts with included blocks 27885 1726882531.83514: done extending task lists 27885 1726882531.83515: done processing included files 27885 1726882531.83516: results queue empty 27885 1726882531.83516: checking for any_errors_fatal 27885 1726882531.83519: done checking for any_errors_fatal 27885 1726882531.83519: checking for max_fail_percentage 27885 1726882531.83520: done checking for max_fail_percentage 27885 1726882531.83521: checking to see if all hosts have failed and the running result is not ok 27885 1726882531.83522: done checking to see if all hosts have failed 27885 1726882531.83522: getting the remaining hosts for this loop 27885 1726882531.83523: done getting the remaining hosts for this loop 27885 1726882531.83525: getting the next task for host managed_node2 27885 1726882531.83529: done getting next task for host managed_node2 27885 1726882531.83531: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 27885 1726882531.83534: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882531.83536: getting variables 27885 1726882531.83536: in VariableManager get_vars() 27885 1726882531.83574: Calling all_inventory to load vars for managed_node2 27885 1726882531.83576: Calling groups_inventory to load vars for managed_node2 27885 1726882531.83578: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882531.83582: Calling all_plugins_play to load vars for managed_node2 27885 1726882531.83585: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882531.83587: Calling groups_plugins_play to load vars for managed_node2 27885 1726882531.83751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882531.83956: done with get_vars() 27885 1726882531.83963: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:35:31 -0400 (0:00:00.023) 0:00:04.482 ****** 27885 1726882531.84009: entering _queue_task() for managed_node2/include_tasks 27885 1726882531.84171: worker is 1 (out of 1 available) 27885 1726882531.84183: exiting _queue_task() for managed_node2/include_tasks 27885 1726882531.84196: done queuing things up, now waiting for results queue to drain 27885 1726882531.84198: waiting for pending results... 27885 1726882531.84333: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 27885 1726882531.84392: in run() - task 12673a56-9f93-3fa5-01be-00000000019d 27885 1726882531.84407: variable 'ansible_search_path' from source: unknown 27885 1726882531.84411: variable 'ansible_search_path' from source: unknown 27885 1726882531.84440: calling self._execute() 27885 1726882531.84497: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.84503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.84511: variable 'omit' from source: magic vars 27885 1726882531.84748: variable 'ansible_distribution_major_version' from source: facts 27885 1726882531.84758: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882531.84762: _execute() done 27885 1726882531.84765: dumping result to json 27885 1726882531.84768: done dumping result, returning 27885 1726882531.84775: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-3fa5-01be-00000000019d] 27885 1726882531.84778: sending task result for task 12673a56-9f93-3fa5-01be-00000000019d 27885 1726882531.84855: done sending task result for task 12673a56-9f93-3fa5-01be-00000000019d 27885 1726882531.84860: WORKER PROCESS EXITING 27885 1726882531.84886: no more pending results, returning what we have 27885 1726882531.84890: in VariableManager get_vars() 27885 1726882531.84927: Calling all_inventory to load vars for managed_node2 27885 1726882531.84929: Calling groups_inventory to load vars for managed_node2 27885 1726882531.84931: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882531.84939: Calling all_plugins_play to load vars for managed_node2 27885 1726882531.84942: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882531.84944: Calling groups_plugins_play to load vars for managed_node2 27885 1726882531.85063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 27885 1726882531.85202: done with get_vars() 27885 1726882531.85208: variable 'ansible_search_path' from source: unknown 27885 1726882531.85208: variable 'ansible_search_path' from source: unknown 27885 1726882531.85243: we have included files to process 27885 1726882531.85244: generating all_blocks data 27885 1726882531.85245: done generating all_blocks data 27885 1726882531.85245: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27885 1726882531.85246: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27885 1726882531.85247: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27885 1726882531.85401: done processing included file 27885 1726882531.85403: iterating over new_blocks loaded from include file 27885 1726882531.85404: in VariableManager get_vars() 27885 1726882531.85417: done with get_vars() 27885 1726882531.85419: filtering new block on tags 27885 1726882531.85429: done filtering new block on tags 27885 1726882531.85430: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 27885 1726882531.85433: extending task lists for all hosts with included blocks 27885 1726882531.85513: done extending task lists 27885 1726882531.85514: done processing included files 27885 1726882531.85514: results queue empty 27885 1726882531.85515: checking for any_errors_fatal 27885 1726882531.85516: done checking for any_errors_fatal 27885 1726882531.85517: checking for max_fail_percentage 27885 1726882531.85518: done checking for max_fail_percentage 27885 1726882531.85519: checking to see if all hosts have failed and the running result is not ok 27885 1726882531.85519: done checking to see if all hosts have failed 27885 1726882531.85520: getting the remaining hosts for this loop 27885 1726882531.85521: done getting the remaining hosts for this loop 27885 1726882531.85523: getting the next task for host managed_node2 27885 1726882531.85526: done getting next task for host managed_node2 27885 1726882531.85527: ^ task is: TASK: Gather current interface info 27885 1726882531.85529: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 27885 1726882531.85531: getting variables 27885 1726882531.85531: in VariableManager get_vars() 27885 1726882531.85539: Calling all_inventory to load vars for managed_node2 27885 1726882531.85540: Calling groups_inventory to load vars for managed_node2 27885 1726882531.85542: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882531.85545: Calling all_plugins_play to load vars for managed_node2 27885 1726882531.85546: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882531.85548: Calling groups_plugins_play to load vars for managed_node2 27885 1726882531.85634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882531.85761: done with get_vars() 27885 1726882531.85770: done getting variables 27885 1726882531.85798: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:35:31 -0400 (0:00:00.018) 0:00:04.500 ****** 27885 1726882531.85816: entering _queue_task() for managed_node2/command 27885 1726882531.86021: worker is 1 (out of 1 available) 27885 1726882531.86033: exiting _queue_task() for managed_node2/command 27885 1726882531.86045: done queuing things up, now waiting for results queue to drain 27885 1726882531.86046: waiting for pending results... 
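The nesting visible in this stretch, manage_test_interface.yml:13 including show_interfaces.yml, which at its line 3 includes get_current_interfaces.yml before the debug at line 5, suggests show_interfaces.yml is a thin wrapper, roughly:

# Hypothetical outline of tasks/show_interfaces.yml, matching the task paths logged here;
# the debug task at line 5 is the one sketched earlier.
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml

get_current_interfaces.yml then re-runs the same Gather/Set pair, which is why the 'Gather current interface info' banner appears again just above and its SSH round trips follow below.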
27885 1726882531.86274: running TaskExecutor() for managed_node2/TASK: Gather current interface info 27885 1726882531.86480: in run() - task 12673a56-9f93-3fa5-01be-0000000001d4 27885 1726882531.86484: variable 'ansible_search_path' from source: unknown 27885 1726882531.86487: variable 'ansible_search_path' from source: unknown 27885 1726882531.86491: calling self._execute() 27885 1726882531.86543: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.86556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.86570: variable 'omit' from source: magic vars 27885 1726882531.87008: variable 'ansible_distribution_major_version' from source: facts 27885 1726882531.87019: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882531.87025: variable 'omit' from source: magic vars 27885 1726882531.87075: variable 'omit' from source: magic vars 27885 1726882531.87119: variable 'omit' from source: magic vars 27885 1726882531.87163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882531.87187: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882531.87218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882531.87236: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882531.87245: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882531.87285: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882531.87288: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.87290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.87384: Set connection var ansible_pipelining to False 27885 1726882531.87388: Set connection var ansible_connection to ssh 27885 1726882531.87408: Set connection var ansible_timeout to 10 27885 1726882531.87411: Set connection var ansible_shell_type to sh 27885 1726882531.87413: Set connection var ansible_shell_executable to /bin/sh 27885 1726882531.87415: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882531.87442: variable 'ansible_shell_executable' from source: unknown 27885 1726882531.87445: variable 'ansible_connection' from source: unknown 27885 1726882531.87449: variable 'ansible_module_compression' from source: unknown 27885 1726882531.87451: variable 'ansible_shell_type' from source: unknown 27885 1726882531.87453: variable 'ansible_shell_executable' from source: unknown 27885 1726882531.87455: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882531.87457: variable 'ansible_pipelining' from source: unknown 27885 1726882531.87460: variable 'ansible_timeout' from source: unknown 27885 1726882531.87478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882531.87585: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882531.87596: variable 'omit' from source: magic vars 27885 
1726882531.87603: starting attempt loop 27885 1726882531.87607: running the handler 27885 1726882531.87620: _low_level_execute_command(): starting 27885 1726882531.87626: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882531.88091: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882531.88120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882531.88124: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882531.88178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882531.88181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882531.88185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882531.88253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882531.89968: stdout chunk (state=3): >>>/root <<< 27885 1726882531.90067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882531.90111: stderr chunk (state=3): >>><<< 27885 1726882531.90114: stdout chunk (state=3): >>><<< 27885 1726882531.90173: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882531.90176: _low_level_execute_command(): starting 27885 1726882531.90179: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882531.9012756-28110-47188353197337 
`" && echo ansible-tmp-1726882531.9012756-28110-47188353197337="` echo /root/.ansible/tmp/ansible-tmp-1726882531.9012756-28110-47188353197337 `" ) && sleep 0' 27885 1726882531.90731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882531.90797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882531.90801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882531.90886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882531.92796: stdout chunk (state=3): >>>ansible-tmp-1726882531.9012756-28110-47188353197337=/root/.ansible/tmp/ansible-tmp-1726882531.9012756-28110-47188353197337 <<< 27885 1726882531.92956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882531.92960: stdout chunk (state=3): >>><<< 27885 1726882531.92964: stderr chunk (state=3): >>><<< 27885 1726882531.92984: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882531.9012756-28110-47188353197337=/root/.ansible/tmp/ansible-tmp-1726882531.9012756-28110-47188353197337 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882531.93202: variable 'ansible_module_compression' from source: unknown 27885 1726882531.93206: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882531.93208: variable 'ansible_facts' from 
source: unknown 27885 1726882531.93210: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882531.9012756-28110-47188353197337/AnsiballZ_command.py 27885 1726882531.93433: Sending initial data 27885 1726882531.93436: Sent initial data (155 bytes) 27885 1726882531.93863: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882531.93879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882531.93924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882531.93936: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882531.94001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882531.95566: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 27885 1726882531.95576: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882531.95624: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27885 1726882531.95687: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpu2l6dc4e /root/.ansible/tmp/ansible-tmp-1726882531.9012756-28110-47188353197337/AnsiballZ_command.py <<< 27885 1726882531.95695: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882531.9012756-28110-47188353197337/AnsiballZ_command.py" <<< 27885 1726882531.95745: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpu2l6dc4e" to remote "/root/.ansible/tmp/ansible-tmp-1726882531.9012756-28110-47188353197337/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882531.9012756-28110-47188353197337/AnsiballZ_command.py" <<< 27885 1726882531.96578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882531.96623: stderr chunk (state=3): >>><<< 27885 1726882531.96626: stdout chunk (state=3): >>><<< 27885 1726882531.96653: done transferring module to remote 27885 1726882531.96659: _low_level_execute_command(): starting 27885 1726882531.96662: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882531.9012756-28110-47188353197337/ /root/.ansible/tmp/ansible-tmp-1726882531.9012756-28110-47188353197337/AnsiballZ_command.py && sleep 0' 27885 1726882531.97076: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882531.97080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882531.97084: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882531.97181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882531.97187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882531.97190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882531.97208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882531.97280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882531.99052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882531.99088: stderr chunk (state=3): >>><<< 27885 1726882531.99094: stdout chunk (state=3): >>><<< 27885 1726882531.99101: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 
originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882531.99103: _low_level_execute_command(): starting 27885 1726882531.99108: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882531.9012756-28110-47188353197337/AnsiballZ_command.py && sleep 0' 27885 1726882531.99666: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882531.99669: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882531.99675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882531.99737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882531.99795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882532.15194: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:35:32.147651", "end": "2024-09-20 21:35:32.151117", "delta": "0:00:00.003466", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27885 1726882532.16751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882532.16778: stderr chunk (state=3): >>><<< 27885 1726882532.16781: stdout chunk (state=3): >>><<< 27885 1726882532.16817: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:35:32.147651", "end": "2024-09-20 21:35:32.151117", "delta": "0:00:00.003466", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
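The module invocation captured above (ansible.legacy.command with chdir=/sys/class/net and _raw_params="ls -1") corresponds to a task roughly like the sketch below. The register name is inferred from the '_current_interfaces' variable referenced by the next task, and changed_when is inferred from the changed=false shown in the task result that follows; both are assumptions, not the literal contents of the test task file.

    - name: Gather current interface info
      command: ls -1                    # module args taken from the invocation above
      args:
        chdir: /sys/class/net
      register: _current_interfaces     # inferred from the next task's variable lookup
      changed_when: false               # inferred from the changed=false in the displayed result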
27885 1726882532.16856: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882531.9012756-28110-47188353197337/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882532.16863: _low_level_execute_command(): starting 27885 1726882532.16868: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882531.9012756-28110-47188353197337/ > /dev/null 2>&1 && sleep 0' 27885 1726882532.17472: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882532.17476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882532.17479: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882532.17481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882532.17520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882532.17585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882532.19443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882532.19480: stderr chunk (state=3): >>><<< 27885 1726882532.19483: stdout chunk (state=3): >>><<< 27885 1726882532.19500: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882532.19506: handler run complete 27885 1726882532.19525: Evaluated conditional (False): False 27885 1726882532.19534: attempt loop complete, returning result 27885 1726882532.19537: _execute() done 27885 1726882532.19539: dumping result to json 27885 1726882532.19546: done dumping result, returning 27885 1726882532.19552: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [12673a56-9f93-3fa5-01be-0000000001d4] 27885 1726882532.19557: sending task result for task 12673a56-9f93-3fa5-01be-0000000001d4 27885 1726882532.19681: done sending task result for task 12673a56-9f93-3fa5-01be-0000000001d4 27885 1726882532.19683: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003466", "end": "2024-09-20 21:35:32.151117", "rc": 0, "start": "2024-09-20 21:35:32.147651" } STDOUT: bonding_masters eth0 lo rpltstbr 27885 1726882532.19789: no more pending results, returning what we have 27885 1726882532.19792: results queue empty 27885 1726882532.19795: checking for any_errors_fatal 27885 1726882532.19797: done checking for any_errors_fatal 27885 1726882532.19797: checking for max_fail_percentage 27885 1726882532.19799: done checking for max_fail_percentage 27885 1726882532.19799: checking to see if all hosts have failed and the running result is not ok 27885 1726882532.19800: done checking to see if all hosts have failed 27885 1726882532.19801: getting the remaining hosts for this loop 27885 1726882532.19802: done getting the remaining hosts for this loop 27885 1726882532.19806: getting the next task for host managed_node2 27885 1726882532.19814: done getting next task for host managed_node2 27885 1726882532.19816: ^ task is: TASK: Set current_interfaces 27885 1726882532.19821: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882532.19826: getting variables 27885 1726882532.19827: in VariableManager get_vars() 27885 1726882532.19928: Calling all_inventory to load vars for managed_node2 27885 1726882532.19931: Calling groups_inventory to load vars for managed_node2 27885 1726882532.19932: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882532.19940: Calling all_plugins_play to load vars for managed_node2 27885 1726882532.19941: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882532.19943: Calling groups_plugins_play to load vars for managed_node2 27885 1726882532.20057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882532.20185: done with get_vars() 27885 1726882532.20199: done getting variables 27885 1726882532.20252: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:35:32 -0400 (0:00:00.344) 0:00:04.845 ****** 27885 1726882532.20275: entering _queue_task() for managed_node2/set_fact 27885 1726882532.20478: worker is 1 (out of 1 available) 27885 1726882532.20496: exiting _queue_task() for managed_node2/set_fact 27885 1726882532.20507: done queuing things up, now waiting for results queue to drain 27885 1726882532.20508: waiting for pending results... 
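The set_fact task queued here (get_current_interfaces.yml:9) presumably looks something like the sketch below; the stdout_lines expression is an assumption, chosen because the current_interfaces list printed further down matches the stdout of the ls -1 run above line for line.

    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"  # assumed source expression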
27885 1726882532.20654: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 27885 1726882532.20742: in run() - task 12673a56-9f93-3fa5-01be-0000000001d5 27885 1726882532.20768: variable 'ansible_search_path' from source: unknown 27885 1726882532.20772: variable 'ansible_search_path' from source: unknown 27885 1726882532.20798: calling self._execute() 27885 1726882532.20873: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882532.20879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882532.20886: variable 'omit' from source: magic vars 27885 1726882532.21208: variable 'ansible_distribution_major_version' from source: facts 27885 1726882532.21219: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882532.21225: variable 'omit' from source: magic vars 27885 1726882532.21257: variable 'omit' from source: magic vars 27885 1726882532.21342: variable '_current_interfaces' from source: set_fact 27885 1726882532.21389: variable 'omit' from source: magic vars 27885 1726882532.21422: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882532.21448: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882532.21467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882532.21512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882532.21531: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882532.21561: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882532.21564: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882532.21571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882532.21670: Set connection var ansible_pipelining to False 27885 1726882532.21673: Set connection var ansible_connection to ssh 27885 1726882532.21678: Set connection var ansible_timeout to 10 27885 1726882532.21681: Set connection var ansible_shell_type to sh 27885 1726882532.21686: Set connection var ansible_shell_executable to /bin/sh 27885 1726882532.21694: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882532.21712: variable 'ansible_shell_executable' from source: unknown 27885 1726882532.21716: variable 'ansible_connection' from source: unknown 27885 1726882532.21718: variable 'ansible_module_compression' from source: unknown 27885 1726882532.21720: variable 'ansible_shell_type' from source: unknown 27885 1726882532.21723: variable 'ansible_shell_executable' from source: unknown 27885 1726882532.21736: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882532.21739: variable 'ansible_pipelining' from source: unknown 27885 1726882532.21741: variable 'ansible_timeout' from source: unknown 27885 1726882532.21743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882532.21855: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 27885 1726882532.21865: variable 'omit' from source: magic vars 27885 1726882532.21868: starting attempt loop 27885 1726882532.21872: running the handler 27885 1726882532.21883: handler run complete 27885 1726882532.21895: attempt loop complete, returning result 27885 1726882532.21898: _execute() done 27885 1726882532.21901: dumping result to json 27885 1726882532.21903: done dumping result, returning 27885 1726882532.21908: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [12673a56-9f93-3fa5-01be-0000000001d5] 27885 1726882532.21911: sending task result for task 12673a56-9f93-3fa5-01be-0000000001d5 ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 27885 1726882532.22044: no more pending results, returning what we have 27885 1726882532.22047: results queue empty 27885 1726882532.22048: checking for any_errors_fatal 27885 1726882532.22054: done checking for any_errors_fatal 27885 1726882532.22055: checking for max_fail_percentage 27885 1726882532.22056: done checking for max_fail_percentage 27885 1726882532.22057: checking to see if all hosts have failed and the running result is not ok 27885 1726882532.22058: done checking to see if all hosts have failed 27885 1726882532.22058: getting the remaining hosts for this loop 27885 1726882532.22060: done getting the remaining hosts for this loop 27885 1726882532.22063: getting the next task for host managed_node2 27885 1726882532.22070: done getting next task for host managed_node2 27885 1726882532.22072: ^ task is: TASK: Show current_interfaces 27885 1726882532.22076: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882532.22078: getting variables 27885 1726882532.22080: in VariableManager get_vars() 27885 1726882532.22117: Calling all_inventory to load vars for managed_node2 27885 1726882532.22119: Calling groups_inventory to load vars for managed_node2 27885 1726882532.22121: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882532.22129: Calling all_plugins_play to load vars for managed_node2 27885 1726882532.22131: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882532.22133: Calling groups_plugins_play to load vars for managed_node2 27885 1726882532.22333: done sending task result for task 12673a56-9f93-3fa5-01be-0000000001d5 27885 1726882532.22339: WORKER PROCESS EXITING 27885 1726882532.22360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882532.22578: done with get_vars() 27885 1726882532.22597: done getting variables 27885 1726882532.22647: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:35:32 -0400 (0:00:00.023) 0:00:04.869 ****** 27885 1726882532.22674: entering _queue_task() for managed_node2/debug 27885 1726882532.22922: worker is 1 (out of 1 available) 27885 1726882532.22936: exiting _queue_task() for managed_node2/debug 27885 1726882532.22947: done queuing things up, now waiting for results queue to drain 27885 1726882532.22948: waiting for pending results... 
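The debug task queued here (show_interfaces.yml:5) is presumably a one-liner of this shape; the msg template is inferred from the MSG line in the result below.

    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"  # matches the MSG output below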
27885 1726882532.23092: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 27885 1726882532.23163: in run() - task 12673a56-9f93-3fa5-01be-00000000019e 27885 1726882532.23175: variable 'ansible_search_path' from source: unknown 27885 1726882532.23180: variable 'ansible_search_path' from source: unknown 27885 1726882532.23209: calling self._execute() 27885 1726882532.23268: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882532.23272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882532.23280: variable 'omit' from source: magic vars 27885 1726882532.23602: variable 'ansible_distribution_major_version' from source: facts 27885 1726882532.23607: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882532.23609: variable 'omit' from source: magic vars 27885 1726882532.23672: variable 'omit' from source: magic vars 27885 1726882532.23773: variable 'current_interfaces' from source: set_fact 27885 1726882532.23802: variable 'omit' from source: magic vars 27885 1726882532.23843: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882532.23871: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882532.23887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882532.23930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882532.23933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882532.23968: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882532.23971: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882532.23973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882532.24098: Set connection var ansible_pipelining to False 27885 1726882532.24101: Set connection var ansible_connection to ssh 27885 1726882532.24104: Set connection var ansible_timeout to 10 27885 1726882532.24106: Set connection var ansible_shell_type to sh 27885 1726882532.24136: Set connection var ansible_shell_executable to /bin/sh 27885 1726882532.24139: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882532.24141: variable 'ansible_shell_executable' from source: unknown 27885 1726882532.24144: variable 'ansible_connection' from source: unknown 27885 1726882532.24146: variable 'ansible_module_compression' from source: unknown 27885 1726882532.24148: variable 'ansible_shell_type' from source: unknown 27885 1726882532.24150: variable 'ansible_shell_executable' from source: unknown 27885 1726882532.24172: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882532.24178: variable 'ansible_pipelining' from source: unknown 27885 1726882532.24206: variable 'ansible_timeout' from source: unknown 27885 1726882532.24208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882532.24300: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
27885 1726882532.24308: variable 'omit' from source: magic vars 27885 1726882532.24314: starting attempt loop 27885 1726882532.24325: running the handler 27885 1726882532.24380: handler run complete 27885 1726882532.24386: attempt loop complete, returning result 27885 1726882532.24389: _execute() done 27885 1726882532.24396: dumping result to json 27885 1726882532.24405: done dumping result, returning 27885 1726882532.24429: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [12673a56-9f93-3fa5-01be-00000000019e] 27885 1726882532.24431: sending task result for task 12673a56-9f93-3fa5-01be-00000000019e 27885 1726882532.24530: done sending task result for task 12673a56-9f93-3fa5-01be-00000000019e 27885 1726882532.24533: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 27885 1726882532.24584: no more pending results, returning what we have 27885 1726882532.24587: results queue empty 27885 1726882532.24588: checking for any_errors_fatal 27885 1726882532.24592: done checking for any_errors_fatal 27885 1726882532.24594: checking for max_fail_percentage 27885 1726882532.24596: done checking for max_fail_percentage 27885 1726882532.24596: checking to see if all hosts have failed and the running result is not ok 27885 1726882532.24597: done checking to see if all hosts have failed 27885 1726882532.24598: getting the remaining hosts for this loop 27885 1726882532.24599: done getting the remaining hosts for this loop 27885 1726882532.24602: getting the next task for host managed_node2 27885 1726882532.24608: done getting next task for host managed_node2 27885 1726882532.24610: ^ task is: TASK: Install iproute 27885 1726882532.24614: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882532.24617: getting variables 27885 1726882532.24618: in VariableManager get_vars() 27885 1726882532.24649: Calling all_inventory to load vars for managed_node2 27885 1726882532.24654: Calling groups_inventory to load vars for managed_node2 27885 1726882532.24656: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882532.24665: Calling all_plugins_play to load vars for managed_node2 27885 1726882532.24667: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882532.24670: Calling groups_plugins_play to load vars for managed_node2 27885 1726882532.24796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882532.24974: done with get_vars() 27885 1726882532.24984: done getting variables 27885 1726882532.25036: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:35:32 -0400 (0:00:00.023) 0:00:04.893 ****** 27885 1726882532.25067: entering _queue_task() for managed_node2/package 27885 1726882532.25271: worker is 1 (out of 1 available) 27885 1726882532.25284: exiting _queue_task() for managed_node2/package 27885 1726882532.25299: done queuing things up, now waiting for results queue to drain 27885 1726882532.25300: waiting for pending results... 
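The package task queued here (manage_test_interface.yml:16) resolves to the dnf backend on this host. Reconstructed from the module_args printed at the end of this trace, it is roughly the sketch below; the real task may also set a 'use' parameter keyed off the __network_is_ostree fact evaluated just after this point, which is an assumption.

    - name: Install iproute
      package:                 # resolved to ansible.legacy.dnf on this host
        name: iproute          # from module_args: name=["iproute"]
        state: present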
27885 1726882532.25479: running TaskExecutor() for managed_node2/TASK: Install iproute 27885 1726882532.25551: in run() - task 12673a56-9f93-3fa5-01be-00000000016d 27885 1726882532.25563: variable 'ansible_search_path' from source: unknown 27885 1726882532.25572: variable 'ansible_search_path' from source: unknown 27885 1726882532.25605: calling self._execute() 27885 1726882532.25666: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882532.25674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882532.25683: variable 'omit' from source: magic vars 27885 1726882532.25931: variable 'ansible_distribution_major_version' from source: facts 27885 1726882532.25940: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882532.25945: variable 'omit' from source: magic vars 27885 1726882532.25974: variable 'omit' from source: magic vars 27885 1726882532.26176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882532.27839: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882532.27882: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882532.27917: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882532.27942: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882532.28257: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882532.28407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882532.28440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882532.28481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882532.28546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882532.28562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882532.28799: variable '__network_is_ostree' from source: set_fact 27885 1726882532.28807: variable 'omit' from source: magic vars 27885 1726882532.28811: variable 'omit' from source: magic vars 27885 1726882532.28825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882532.28868: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882532.28887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882532.28903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 27885 1726882532.28912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882532.28936: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882532.28939: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882532.28943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882532.29068: Set connection var ansible_pipelining to False 27885 1726882532.29075: Set connection var ansible_connection to ssh 27885 1726882532.29077: Set connection var ansible_timeout to 10 27885 1726882532.29082: Set connection var ansible_shell_type to sh 27885 1726882532.29084: Set connection var ansible_shell_executable to /bin/sh 27885 1726882532.29089: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882532.29155: variable 'ansible_shell_executable' from source: unknown 27885 1726882532.29158: variable 'ansible_connection' from source: unknown 27885 1726882532.29160: variable 'ansible_module_compression' from source: unknown 27885 1726882532.29162: variable 'ansible_shell_type' from source: unknown 27885 1726882532.29164: variable 'ansible_shell_executable' from source: unknown 27885 1726882532.29166: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882532.29167: variable 'ansible_pipelining' from source: unknown 27885 1726882532.29169: variable 'ansible_timeout' from source: unknown 27885 1726882532.29171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882532.29266: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882532.29299: variable 'omit' from source: magic vars 27885 1726882532.29304: starting attempt loop 27885 1726882532.29307: running the handler 27885 1726882532.29314: variable 'ansible_facts' from source: unknown 27885 1726882532.29318: variable 'ansible_facts' from source: unknown 27885 1726882532.29364: _low_level_execute_command(): starting 27885 1726882532.29367: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882532.29974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882532.30040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 
1726882532.30307: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882532.31869: stdout chunk (state=3): >>>/root <<< 27885 1726882532.31973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882532.31998: stderr chunk (state=3): >>><<< 27885 1726882532.32002: stdout chunk (state=3): >>><<< 27885 1726882532.32023: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882532.32035: _low_level_execute_command(): starting 27885 1726882532.32042: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882532.3202338-28124-276231795201657 `" && echo ansible-tmp-1726882532.3202338-28124-276231795201657="` echo /root/.ansible/tmp/ansible-tmp-1726882532.3202338-28124-276231795201657 `" ) && sleep 0' 27885 1726882532.32444: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882532.32447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882532.32450: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882532.32452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882532.32505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882532.32512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 
27885 1726882532.32574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882532.34520: stdout chunk (state=3): >>>ansible-tmp-1726882532.3202338-28124-276231795201657=/root/.ansible/tmp/ansible-tmp-1726882532.3202338-28124-276231795201657 <<< 27885 1726882532.34673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882532.34676: stdout chunk (state=3): >>><<< 27885 1726882532.34678: stderr chunk (state=3): >>><<< 27885 1726882532.34765: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882532.3202338-28124-276231795201657=/root/.ansible/tmp/ansible-tmp-1726882532.3202338-28124-276231795201657 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882532.34768: variable 'ansible_module_compression' from source: unknown 27885 1726882532.34818: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 27885 1726882532.34828: ANSIBALLZ: Acquiring lock 27885 1726882532.34835: ANSIBALLZ: Lock acquired: 140560087758944 27885 1726882532.34850: ANSIBALLZ: Creating module 27885 1726882532.48679: ANSIBALLZ: Writing module into payload 27885 1726882532.48878: ANSIBALLZ: Writing module 27885 1726882532.48909: ANSIBALLZ: Renaming module 27885 1726882532.49098: ANSIBALLZ: Done creating module 27885 1726882532.49102: variable 'ansible_facts' from source: unknown 27885 1726882532.49104: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882532.3202338-28124-276231795201657/AnsiballZ_dnf.py 27885 1726882532.49221: Sending initial data 27885 1726882532.49231: Sent initial data (152 bytes) 27885 1726882532.49817: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882532.49832: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882532.49906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882532.49951: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882532.49967: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882532.49991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882532.50104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882532.51789: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882532.51873: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882532.51945: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpffmo7ibi /root/.ansible/tmp/ansible-tmp-1726882532.3202338-28124-276231795201657/AnsiballZ_dnf.py <<< 27885 1726882532.51948: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882532.3202338-28124-276231795201657/AnsiballZ_dnf.py" <<< 27885 1726882532.52021: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpffmo7ibi" to remote "/root/.ansible/tmp/ansible-tmp-1726882532.3202338-28124-276231795201657/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882532.3202338-28124-276231795201657/AnsiballZ_dnf.py" <<< 27885 1726882532.53079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882532.53113: stderr chunk (state=3): >>><<< 27885 1726882532.53123: stdout chunk (state=3): >>><<< 27885 1726882532.53263: done transferring module to remote 27885 1726882532.53266: _low_level_execute_command(): starting 27885 1726882532.53272: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882532.3202338-28124-276231795201657/ /root/.ansible/tmp/ansible-tmp-1726882532.3202338-28124-276231795201657/AnsiballZ_dnf.py && sleep 0' 27885 1726882532.53799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882532.53802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882532.53805: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882532.53807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882532.53809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882532.53853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882532.53856: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882532.53924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882532.55709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882532.55774: stderr chunk (state=3): >>><<< 27885 1726882532.55779: stdout chunk (state=3): >>><<< 27885 1726882532.55781: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882532.55783: _low_level_execute_command(): starting 27885 1726882532.55786: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882532.3202338-28124-276231795201657/AnsiballZ_dnf.py && sleep 0' 27885 1726882532.56408: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 
originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882532.56412: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882532.56559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882532.97024: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 27885 1726882533.01302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882533.01306: stdout chunk (state=3): >>><<< 27885 1726882533.01308: stderr chunk (state=3): >>><<< 27885 1726882533.01311: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.14.69 closed. 27885 1726882533.01314: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882532.3202338-28124-276231795201657/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882533.01323: _low_level_execute_command(): starting 27885 1726882533.01326: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882532.3202338-28124-276231795201657/ > /dev/null 2>&1 && sleep 0' 27885 1726882533.02342: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882533.02358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882533.02374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882533.02478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882533.02510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882533.02600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882533.04447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882533.04456: stdout chunk (state=3): >>><<< 27885 1726882533.04466: stderr chunk (state=3): >>><<< 27885 1726882533.04483: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882533.04499: handler run complete 27885 1726882533.04660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882533.05011: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882533.05014: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882533.05016: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882533.05018: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882533.05029: variable '__install_status' from source: unknown 27885 1726882533.05052: Evaluated conditional (__install_status is success): True 27885 1726882533.05072: attempt loop complete, returning result 27885 1726882533.05083: _execute() done 27885 1726882533.05179: dumping result to json 27885 1726882533.05192: done dumping result, returning 27885 1726882533.05231: done running TaskExecutor() for managed_node2/TASK: Install iproute [12673a56-9f93-3fa5-01be-00000000016d] 27885 1726882533.05338: sending task result for task 12673a56-9f93-3fa5-01be-00000000016d 27885 1726882533.05554: done sending task result for task 12673a56-9f93-3fa5-01be-00000000016d 27885 1726882533.05558: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 27885 1726882533.05642: no more pending results, returning what we have 27885 1726882533.05645: results queue empty 27885 1726882533.05646: checking for any_errors_fatal 27885 1726882533.05650: done checking for any_errors_fatal 27885 1726882533.05651: checking for max_fail_percentage 27885 1726882533.05652: done checking for max_fail_percentage 27885 1726882533.05653: checking to see if all hosts have failed and the running result is not ok 27885 1726882533.05653: done checking to see if all hosts have failed 27885 1726882533.05654: getting the remaining hosts for this loop 27885 1726882533.05656: done getting the remaining hosts for this loop 27885 1726882533.05659: getting the next task for host managed_node2 27885 1726882533.05664: done getting next task for host managed_node2 27885 1726882533.05667: ^ task is: TASK: Create veth interface {{ interface }} 27885 1726882533.05670: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882533.05673: getting variables 27885 1726882533.05675: in VariableManager get_vars() 27885 1726882533.05715: Calling all_inventory to load vars for managed_node2 27885 1726882533.05717: Calling groups_inventory to load vars for managed_node2 27885 1726882533.05720: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882533.05729: Calling all_plugins_play to load vars for managed_node2 27885 1726882533.05732: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882533.05734: Calling groups_plugins_play to load vars for managed_node2 27885 1726882533.06556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882533.06917: done with get_vars() 27885 1726882533.06930: done getting variables 27885 1726882533.07016: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882533.07147: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:35:33 -0400 (0:00:00.821) 0:00:05.714 ****** 27885 1726882533.07197: entering _queue_task() for managed_node2/command 27885 1726882533.07471: worker is 1 (out of 1 available) 27885 1726882533.07485: exiting _queue_task() for managed_node2/command 27885 1726882533.07500: done queuing things up, now waiting for results queue to drain 27885 1726882533.07502: waiting for pending results... 
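The "Nothing to do" result above is the dnf module reporting that iproute is already present on the managed node. Judging from the logged module arguments (name: ["iproute"], state: "present") and the __install_status is success conditional evaluated right afterwards, the underlying task is roughly the sketch below. This is a reconstruction, not the literal tasks file: whether the play calls dnf directly or via package, and the retries/delay values, are assumptions; only the registered variable name and a single attempt ("attempts": 1) are visible in the log.

    # Hedged reconstruction of the "Install iproute" task (assumed layout, not the real file).
    - name: Install iproute
      dnf:                                  # the log shows the action resolving to ansible.legacy.dnf
        name: iproute
        state: present
      register: __install_status            # matches the conditional evaluated in the log
      until: __install_status is success    # retry-until-success pattern implied by that conditional
      retries: 3                            # assumption; the log only records one attempt
      delay: 5                              # assumption

With the package already installed, this task reports ok/changed=false, which is exactly the "MSG: Nothing to do" result recorded above.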
27885 1726882533.07760: running TaskExecutor() for managed_node2/TASK: Create veth interface ethtest0 27885 1726882533.07871: in run() - task 12673a56-9f93-3fa5-01be-00000000016e 27885 1726882533.07895: variable 'ansible_search_path' from source: unknown 27885 1726882533.07904: variable 'ansible_search_path' from source: unknown 27885 1726882533.08176: variable 'interface' from source: set_fact 27885 1726882533.08263: variable 'interface' from source: set_fact 27885 1726882533.08349: variable 'interface' from source: set_fact 27885 1726882533.08502: Loaded config def from plugin (lookup/items) 27885 1726882533.08514: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 27885 1726882533.08537: variable 'omit' from source: magic vars 27885 1726882533.08657: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882533.08697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882533.08700: variable 'omit' from source: magic vars 27885 1726882533.08973: variable 'ansible_distribution_major_version' from source: facts 27885 1726882533.08986: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882533.09198: variable 'type' from source: set_fact 27885 1726882533.09259: variable 'state' from source: include params 27885 1726882533.09266: variable 'interface' from source: set_fact 27885 1726882533.09269: variable 'current_interfaces' from source: set_fact 27885 1726882533.09271: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 27885 1726882533.09273: variable 'omit' from source: magic vars 27885 1726882533.09288: variable 'omit' from source: magic vars 27885 1726882533.09343: variable 'item' from source: unknown 27885 1726882533.09423: variable 'item' from source: unknown 27885 1726882533.09442: variable 'omit' from source: magic vars 27885 1726882533.09481: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882533.09587: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882533.09594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882533.09597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882533.09600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882533.09610: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882533.09617: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882533.09623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882533.09735: Set connection var ansible_pipelining to False 27885 1726882533.09745: Set connection var ansible_connection to ssh 27885 1726882533.09754: Set connection var ansible_timeout to 10 27885 1726882533.09763: Set connection var ansible_shell_type to sh 27885 1726882533.09775: Set connection var ansible_shell_executable to /bin/sh 27885 1726882533.09783: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882533.09819: variable 'ansible_shell_executable' from source: unknown 27885 1726882533.09826: variable 'ansible_connection' from source: unknown 27885 1726882533.09833: variable 
'ansible_module_compression' from source: unknown 27885 1726882533.09898: variable 'ansible_shell_type' from source: unknown 27885 1726882533.09901: variable 'ansible_shell_executable' from source: unknown 27885 1726882533.09903: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882533.09905: variable 'ansible_pipelining' from source: unknown 27885 1726882533.09908: variable 'ansible_timeout' from source: unknown 27885 1726882533.09919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882533.10013: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882533.10037: variable 'omit' from source: magic vars 27885 1726882533.10061: starting attempt loop 27885 1726882533.10068: running the handler 27885 1726882533.10085: _low_level_execute_command(): starting 27885 1726882533.10102: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882533.11372: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.11504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882533.11523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882533.11603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882533.11717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882533.13315: stdout chunk (state=3): >>>/root <<< 27885 1726882533.13608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882533.13707: stdout chunk (state=3): >>><<< 27885 1726882533.13715: stderr chunk (state=3): >>><<< 27885 1726882533.13734: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882533.13747: _low_level_execute_command(): starting 27885 1726882533.13753: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882533.1373382-28162-194178657204762 `" && echo ansible-tmp-1726882533.1373382-28162-194178657204762="` echo /root/.ansible/tmp/ansible-tmp-1726882533.1373382-28162-194178657204762 `" ) && sleep 0' 27885 1726882533.14820: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882533.14823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882533.14899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882533.15097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882533.15101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.15104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882533.15144: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882533.15212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882533.17097: stdout chunk (state=3): >>>ansible-tmp-1726882533.1373382-28162-194178657204762=/root/.ansible/tmp/ansible-tmp-1726882533.1373382-28162-194178657204762 <<< 27885 1726882533.17181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882533.17222: stderr chunk (state=3): >>><<< 27885 1726882533.17227: stdout chunk (state=3): >>><<< 27885 1726882533.17257: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882533.1373382-28162-194178657204762=/root/.ansible/tmp/ansible-tmp-1726882533.1373382-28162-194178657204762 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882533.17297: variable 'ansible_module_compression' from source: unknown 27885 1726882533.17345: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882533.17364: variable 'ansible_facts' from source: unknown 27885 1726882533.17420: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882533.1373382-28162-194178657204762/AnsiballZ_command.py 27885 1726882533.17522: Sending initial data 27885 1726882533.17526: Sent initial data (156 bytes) 27885 1726882533.17935: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882533.17938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882533.17941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.17943: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882533.17946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.17984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882533.17999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882533.18066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882533.19618: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 
debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882533.19710: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882533.19768: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpueqywjyw /root/.ansible/tmp/ansible-tmp-1726882533.1373382-28162-194178657204762/AnsiballZ_command.py <<< 27885 1726882533.19770: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882533.1373382-28162-194178657204762/AnsiballZ_command.py" <<< 27885 1726882533.19825: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpueqywjyw" to remote "/root/.ansible/tmp/ansible-tmp-1726882533.1373382-28162-194178657204762/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882533.1373382-28162-194178657204762/AnsiballZ_command.py" <<< 27885 1726882533.20409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882533.20450: stderr chunk (state=3): >>><<< 27885 1726882533.20453: stdout chunk (state=3): >>><<< 27885 1726882533.20483: done transferring module to remote 27885 1726882533.20495: _low_level_execute_command(): starting 27885 1726882533.20501: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882533.1373382-28162-194178657204762/ /root/.ansible/tmp/ansible-tmp-1726882533.1373382-28162-194178657204762/AnsiballZ_command.py && sleep 0' 27885 1726882533.20869: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882533.20912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882533.20919: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882533.20921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.20924: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882533.20926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.20958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882533.20961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882533.21028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882533.22735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882533.22820: stderr chunk (state=3): >>><<< 27885 1726882533.22824: stdout chunk (state=3): >>><<< 27885 
1726882533.22826: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882533.22829: _low_level_execute_command(): starting 27885 1726882533.22831: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882533.1373382-28162-194178657204762/AnsiballZ_command.py && sleep 0' 27885 1726882533.23161: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882533.23164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.23166: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882533.23168: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882533.23170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.23220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882533.23223: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882533.23300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882533.38770: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-20 21:35:33.380224", "end": "2024-09-20 21:35:33.385654", "delta": "0:00:00.005430", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27885 1726882533.40878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882533.40913: stderr chunk (state=3): >>><<< 27885 1726882533.40916: stdout chunk (state=3): >>><<< 27885 1726882533.40931: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-20 21:35:33.380224", "end": "2024-09-20 21:35:33.385654", "delta": "0:00:00.005430", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
27885 1726882533.40959: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882533.1373382-28162-194178657204762/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882533.40971: _low_level_execute_command(): starting 27885 1726882533.40974: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882533.1373382-28162-194178657204762/ > /dev/null 2>&1 && sleep 0' 27885 1726882533.41437: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882533.41441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.41444: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882533.41446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.41496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882533.41499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882533.41502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882533.41570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882533.45760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882533.45785: stderr chunk (state=3): >>><<< 27885 1726882533.45790: stdout chunk (state=3): >>><<< 27885 1726882533.45808: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882533.45812: handler run complete 27885 1726882533.45832: Evaluated conditional (False): False 27885 1726882533.45840: attempt loop complete, returning result 27885 1726882533.45855: variable 'item' from source: unknown 27885 1726882533.45918: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.005430", "end": "2024-09-20 21:35:33.385654", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-20 21:35:33.380224" } 27885 1726882533.46086: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882533.46088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882533.46090: variable 'omit' from source: magic vars 27885 1726882533.46189: variable 'ansible_distribution_major_version' from source: facts 27885 1726882533.46199: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882533.46321: variable 'type' from source: set_fact 27885 1726882533.46324: variable 'state' from source: include params 27885 1726882533.46326: variable 'interface' from source: set_fact 27885 1726882533.46329: variable 'current_interfaces' from source: set_fact 27885 1726882533.46336: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 27885 1726882533.46339: variable 'omit' from source: magic vars 27885 1726882533.46351: variable 'omit' from source: magic vars 27885 1726882533.46376: variable 'item' from source: unknown 27885 1726882533.46426: variable 'item' from source: unknown 27885 1726882533.46438: variable 'omit' from source: magic vars 27885 1726882533.46455: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882533.46462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882533.46468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882533.46478: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882533.46481: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882533.46483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882533.46543: Set connection var ansible_pipelining to False 27885 1726882533.46546: Set connection var ansible_connection to ssh 27885 1726882533.46548: Set connection var ansible_timeout to 10 27885 1726882533.46551: Set connection var ansible_shell_type to sh 27885 1726882533.46555: Set connection var ansible_shell_executable to /bin/sh 27885 
1726882533.46560: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882533.46575: variable 'ansible_shell_executable' from source: unknown 27885 1726882533.46578: variable 'ansible_connection' from source: unknown 27885 1726882533.46580: variable 'ansible_module_compression' from source: unknown 27885 1726882533.46583: variable 'ansible_shell_type' from source: unknown 27885 1726882533.46585: variable 'ansible_shell_executable' from source: unknown 27885 1726882533.46587: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882533.46591: variable 'ansible_pipelining' from source: unknown 27885 1726882533.46598: variable 'ansible_timeout' from source: unknown 27885 1726882533.46602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882533.46669: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882533.46676: variable 'omit' from source: magic vars 27885 1726882533.46680: starting attempt loop 27885 1726882533.46682: running the handler 27885 1726882533.46689: _low_level_execute_command(): starting 27885 1726882533.46697: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882533.47143: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882533.47146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882533.47152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.47155: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882533.47157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882533.47159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.47198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882533.47213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882533.47285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882533.48865: stdout chunk (state=3): >>>/root <<< 27885 1726882533.48962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882533.48988: stderr chunk (state=3): >>><<< 27885 1726882533.48995: stdout chunk (state=3): >>><<< 27885 1726882533.49006: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882533.49016: _low_level_execute_command(): starting 27885 1726882533.49022: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882533.490062-28162-54948568331581 `" && echo ansible-tmp-1726882533.490062-28162-54948568331581="` echo /root/.ansible/tmp/ansible-tmp-1726882533.490062-28162-54948568331581 `" ) && sleep 0' 27885 1726882533.49454: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882533.49457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882533.49460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.49463: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882533.49465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.49517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882533.49520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882533.49525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882533.49587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882533.51435: stdout chunk (state=3): >>>ansible-tmp-1726882533.490062-28162-54948568331581=/root/.ansible/tmp/ansible-tmp-1726882533.490062-28162-54948568331581 <<< 27885 1726882533.51537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882533.51560: stderr chunk (state=3): >>><<< 27885 1726882533.51563: stdout chunk (state=3): >>><<< 27885 1726882533.51575: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882533.490062-28162-54948568331581=/root/.ansible/tmp/ansible-tmp-1726882533.490062-28162-54948568331581 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882533.51597: variable 'ansible_module_compression' from source: unknown 27885 1726882533.51628: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882533.51642: variable 'ansible_facts' from source: unknown 27885 1726882533.51686: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882533.490062-28162-54948568331581/AnsiballZ_command.py 27885 1726882533.51774: Sending initial data 27885 1726882533.51778: Sent initial data (154 bytes) 27885 1726882533.52177: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882533.52191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882533.52213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882533.52216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.52265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882533.52268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882533.52333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882533.53844: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 27885 1726882533.53848: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882533.53909: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882533.53970: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpc4apu8im /root/.ansible/tmp/ansible-tmp-1726882533.490062-28162-54948568331581/AnsiballZ_command.py <<< 27885 1726882533.53976: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882533.490062-28162-54948568331581/AnsiballZ_command.py" <<< 27885 1726882533.54034: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpc4apu8im" to remote "/root/.ansible/tmp/ansible-tmp-1726882533.490062-28162-54948568331581/AnsiballZ_command.py" <<< 27885 1726882533.54037: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882533.490062-28162-54948568331581/AnsiballZ_command.py" <<< 27885 1726882533.54636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882533.54673: stderr chunk (state=3): >>><<< 27885 1726882533.54676: stdout chunk (state=3): >>><<< 27885 1726882533.54701: done transferring module to remote 27885 1726882533.54707: _low_level_execute_command(): starting 27885 1726882533.54711: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882533.490062-28162-54948568331581/ /root/.ansible/tmp/ansible-tmp-1726882533.490062-28162-54948568331581/AnsiballZ_command.py && sleep 0' 27885 1726882533.55132: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882533.55136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882533.55138: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.55140: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882533.55142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882533.55144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.55185: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882533.55190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882533.55257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882533.56961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882533.56983: stderr chunk (state=3): >>><<< 27885 1726882533.56986: stdout chunk (state=3): >>><<< 27885 1726882533.56999: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882533.57002: _low_level_execute_command(): starting 27885 1726882533.57007: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882533.490062-28162-54948568331581/AnsiballZ_command.py && sleep 0' 27885 1726882533.57423: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882533.57426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882533.57428: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.57430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882533.57432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882533.57435: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.57478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882533.57482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 
1726882533.57553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882533.72863: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 21:35:33.724420", "end": "2024-09-20 21:35:33.727783", "delta": "0:00:00.003363", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27885 1726882533.74387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882533.74395: stdout chunk (state=3): >>><<< 27885 1726882533.74398: stderr chunk (state=3): >>><<< 27885 1726882533.74510: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 21:35:33.724420", "end": "2024-09-20 21:35:33.727783", "delta": "0:00:00.003363", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
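The stdout chunk above is the whole contract between the transferred AnsiballZ_command.py and the controller: a single JSON object printed to stdout. As an illustrative aside (not part of the log, and abridged to the fields the callback later displays), the Python sketch below parses that payload to show where the per-item rc, cmd and delta values come from.

# Illustrative only: parse an abridged copy of the module result JSON printed on
# stdout above; these are the fields reported for the 'peerethtest0' loop item.
import json

raw = '{"changed": true, "stdout": "", "stderr": "", "rc": 0, ' \
      '"cmd": ["ip", "link", "set", "peerethtest0", "up"], ' \
      '"start": "2024-09-20 21:35:33.724420", "end": "2024-09-20 21:35:33.727783", ' \
      '"delta": "0:00:00.003363", "msg": ""}'

result = json.loads(raw)
print(result["rc"])     # 0 -> the item is reported as ok
print(result["cmd"])    # ['ip', 'link', 'set', 'peerethtest0', 'up']
print(result["delta"])  # '0:00:00.003363'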
27885 1726882533.74544: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882533.490062-28162-54948568331581/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882533.74548: _low_level_execute_command(): starting 27885 1726882533.74556: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882533.490062-28162-54948568331581/ > /dev/null 2>&1 && sleep 0' 27885 1726882533.75884: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882533.76099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882533.76102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882533.76105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.76129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882533.76152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882533.76283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882533.78108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882533.78111: stdout chunk (state=3): >>><<< 27885 1726882533.78176: stderr chunk (state=3): >>><<< 27885 1726882533.78196: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882533.78200: handler run complete 27885 1726882533.78283: Evaluated conditional (False): False 27885 1726882533.78297: attempt loop complete, returning result 27885 1726882533.78318: variable 'item' from source: unknown 27885 1726882533.78528: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.003363", "end": "2024-09-20 21:35:33.727783", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-20 21:35:33.724420" } 27885 1726882533.78798: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882533.78802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882533.78804: variable 'omit' from source: magic vars 27885 1726882533.78851: variable 'ansible_distribution_major_version' from source: facts 27885 1726882533.78856: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882533.79081: variable 'type' from source: set_fact 27885 1726882533.79084: variable 'state' from source: include params 27885 1726882533.79087: variable 'interface' from source: set_fact 27885 1726882533.79096: variable 'current_interfaces' from source: set_fact 27885 1726882533.79101: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 27885 1726882533.79105: variable 'omit' from source: magic vars 27885 1726882533.79117: variable 'omit' from source: magic vars 27885 1726882533.79143: variable 'item' from source: unknown 27885 1726882533.79198: variable 'item' from source: unknown 27885 1726882533.79211: variable 'omit' from source: magic vars 27885 1726882533.79228: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882533.79235: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882533.79242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882533.79251: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882533.79255: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882533.79263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882533.79315: Set connection var ansible_pipelining to False 27885 1726882533.79319: Set connection var ansible_connection to ssh 27885 1726882533.79324: Set connection var ansible_timeout to 10 27885 1726882533.79326: Set connection var ansible_shell_type to sh 27885 1726882533.79331: Set connection var ansible_shell_executable to /bin/sh 27885 1726882533.79336: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882533.79351: variable 'ansible_shell_executable' from source: unknown 27885 
1726882533.79354: variable 'ansible_connection' from source: unknown 27885 1726882533.79356: variable 'ansible_module_compression' from source: unknown 27885 1726882533.79358: variable 'ansible_shell_type' from source: unknown 27885 1726882533.79360: variable 'ansible_shell_executable' from source: unknown 27885 1726882533.79363: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882533.79373: variable 'ansible_pipelining' from source: unknown 27885 1726882533.79375: variable 'ansible_timeout' from source: unknown 27885 1726882533.79379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882533.79443: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882533.79450: variable 'omit' from source: magic vars 27885 1726882533.79453: starting attempt loop 27885 1726882533.79455: running the handler 27885 1726882533.79462: _low_level_execute_command(): starting 27885 1726882533.79465: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882533.79875: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882533.79912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882533.79915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882533.79917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.79919: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882533.79921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882533.79923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.79972: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882533.79976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882533.80044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882533.81620: stdout chunk (state=3): >>>/root <<< 27885 1726882533.81715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882533.81799: stderr chunk (state=3): >>><<< 27885 1726882533.81802: stdout chunk (state=3): >>><<< 27885 1726882533.81805: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882533.81807: _low_level_execute_command(): starting 27885 1726882533.81810: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882533.8176472-28162-31075597045301 `" && echo ansible-tmp-1726882533.8176472-28162-31075597045301="` echo /root/.ansible/tmp/ansible-tmp-1726882533.8176472-28162-31075597045301 `" ) && sleep 0' 27885 1726882533.82386: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882533.82392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882533.82397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882533.82399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882533.82401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882533.82442: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882533.82445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.82448: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27885 1726882533.82450: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882533.82452: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27885 1726882533.82454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882533.82460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882533.82479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882533.82482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882533.82485: stderr chunk (state=3): >>>debug2: match found <<< 27885 1726882533.82498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.82557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882533.82589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882533.82597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882533.82672: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 27885 1726882533.84669: stdout chunk (state=3): >>>ansible-tmp-1726882533.8176472-28162-31075597045301=/root/.ansible/tmp/ansible-tmp-1726882533.8176472-28162-31075597045301 <<< 27885 1726882533.84711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882533.84747: stderr chunk (state=3): >>><<< 27885 1726882533.84887: stdout chunk (state=3): >>><<< 27885 1726882533.84896: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882533.8176472-28162-31075597045301=/root/.ansible/tmp/ansible-tmp-1726882533.8176472-28162-31075597045301 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882533.84899: variable 'ansible_module_compression' from source: unknown 27885 1726882533.84901: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882533.85004: variable 'ansible_facts' from source: unknown 27885 1726882533.85162: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882533.8176472-28162-31075597045301/AnsiballZ_command.py 27885 1726882533.85622: Sending initial data 27885 1726882533.85625: Sent initial data (155 bytes) 27885 1726882533.86364: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882533.86407: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.86433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882533.86448: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882533.86451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882533.86577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882533.88098: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882533.88141: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882533.88228: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp0hgqewn8 /root/.ansible/tmp/ansible-tmp-1726882533.8176472-28162-31075597045301/AnsiballZ_command.py <<< 27885 1726882533.88231: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882533.8176472-28162-31075597045301/AnsiballZ_command.py" <<< 27885 1726882533.88380: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp0hgqewn8" to remote "/root/.ansible/tmp/ansible-tmp-1726882533.8176472-28162-31075597045301/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882533.8176472-28162-31075597045301/AnsiballZ_command.py" <<< 27885 1726882533.90000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882533.90004: stdout chunk (state=3): >>><<< 27885 1726882533.90006: stderr chunk (state=3): >>><<< 27885 1726882533.90010: done transferring module to remote 27885 1726882533.90012: _low_level_execute_command(): starting 27885 1726882533.90015: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882533.8176472-28162-31075597045301/ /root/.ansible/tmp/ansible-tmp-1726882533.8176472-28162-31075597045301/AnsiballZ_command.py && sleep 0' 27885 1726882533.91211: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882533.91275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882533.91315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882533.91318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882533.91485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882533.93237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882533.93246: stdout chunk (state=3): >>><<< 27885 1726882533.93289: stderr chunk (state=3): >>><<< 27885 1726882533.93314: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882533.93466: _low_level_execute_command(): starting 27885 1726882533.93469: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882533.8176472-28162-31075597045301/AnsiballZ_command.py && sleep 0' 27885 1726882533.94067: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882533.94082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882533.94108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882533.94127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882533.94147: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882533.94250: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882533.94291: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882533.94360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882534.09714: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 21:35:34.092001", "end": "2024-09-20 21:35:34.095477", "delta": "0:00:00.003476", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27885 1726882534.11097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882534.11102: stdout chunk (state=3): >>><<< 27885 1726882534.11104: stderr chunk (state=3): >>><<< 27885 1726882534.11124: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 21:35:34.092001", "end": "2024-09-20 21:35:34.095477", "delta": "0:00:00.003476", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
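A small sanity check, separate from the log itself: the "delta" field in the ethtest0 result above is simply end minus start, both timestamps taken from the managed node's clock. The sketch below reproduces the reported value.

# Sketch: recompute the "delta" reported for 'ip link set ethtest0 up' above
# from its "start" and "end" timestamps (managed node local time).
from datetime import datetime

fmt = "%Y-%m-%d %H:%M:%S.%f"
start = datetime.strptime("2024-09-20 21:35:34.092001", fmt)
end = datetime.strptime("2024-09-20 21:35:34.095477", fmt)
print(end - start)  # 0:00:00.003476, matching the reported delta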
27885 1726882534.11205: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882533.8176472-28162-31075597045301/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882534.11209: _low_level_execute_command(): starting 27885 1726882534.11212: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882533.8176472-28162-31075597045301/ > /dev/null 2>&1 && sleep 0' 27885 1726882534.11766: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882534.11782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882534.11799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882534.11817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882534.11834: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882534.11846: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882534.11860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882534.11912: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882534.11964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882534.11987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882534.12003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882534.12096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882534.13956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882534.14005: stdout chunk (state=3): >>><<< 27885 1726882534.14017: stderr chunk (state=3): >>><<< 27885 1726882534.14037: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882534.14400: handler run complete 27885 1726882534.14403: Evaluated conditional (False): False 27885 1726882534.14405: attempt loop complete, returning result 27885 1726882534.14407: variable 'item' from source: unknown 27885 1726882534.14409: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.003476", "end": "2024-09-20 21:35:34.095477", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-20 21:35:34.092001" } 27885 1726882534.14509: dumping result to json 27885 1726882534.14512: done dumping result, returning 27885 1726882534.14514: done running TaskExecutor() for managed_node2/TASK: Create veth interface ethtest0 [12673a56-9f93-3fa5-01be-00000000016e] 27885 1726882534.14898: sending task result for task 12673a56-9f93-3fa5-01be-00000000016e 27885 1726882534.14949: done sending task result for task 12673a56-9f93-3fa5-01be-00000000016e 27885 1726882534.14952: WORKER PROCESS EXITING 27885 1726882534.15024: no more pending results, returning what we have 27885 1726882534.15027: results queue empty 27885 1726882534.15028: checking for any_errors_fatal 27885 1726882534.15033: done checking for any_errors_fatal 27885 1726882534.15033: checking for max_fail_percentage 27885 1726882534.15035: done checking for max_fail_percentage 27885 1726882534.15035: checking to see if all hosts have failed and the running result is not ok 27885 1726882534.15036: done checking to see if all hosts have failed 27885 1726882534.15037: getting the remaining hosts for this loop 27885 1726882534.15039: done getting the remaining hosts for this loop 27885 1726882534.15042: getting the next task for host managed_node2 27885 1726882534.15048: done getting next task for host managed_node2 27885 1726882534.15050: ^ task is: TASK: Set up veth as managed by NetworkManager 27885 1726882534.15054: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882534.15058: getting variables 27885 1726882534.15060: in VariableManager get_vars() 27885 1726882534.15107: Calling all_inventory to load vars for managed_node2 27885 1726882534.15110: Calling groups_inventory to load vars for managed_node2 27885 1726882534.15112: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882534.15124: Calling all_plugins_play to load vars for managed_node2 27885 1726882534.15127: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882534.15130: Calling groups_plugins_play to load vars for managed_node2 27885 1726882534.15860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882534.16401: done with get_vars() 27885 1726882534.16413: done getting variables 27885 1726882534.16469: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:35:34 -0400 (0:00:01.093) 0:00:06.807 ****** 27885 1726882534.16501: entering _queue_task() for managed_node2/command 27885 1726882534.17183: worker is 1 (out of 1 available) 27885 1726882534.17200: exiting _queue_task() for managed_node2/command 27885 1726882534.17212: done queuing things up, now waiting for results queue to drain 27885 1726882534.17214: waiting for pending results... 
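Before this task's own mkdir round trip appears below, a hedged sketch of the temp-directory step seen twice already: the scratch path appears to combine a timestamp, the worker PID (28162 in the directories above) and a random suffix, created through a /bin/sh -c wrapper whose echoed name=path pair the controller reads back from stdout. The snippet is an illustration under those assumptions, not Ansible's actual implementation.

# Hedged illustration (assumed naming scheme, not Ansible source): build the kind of
# "( umask 77 && mkdir -p ... ) && sleep 0" wrapper that _low_level_execute_command()
# hands to /bin/sh -c in the log above.
import os
import random
import time

remote_tmp = "/root/.ansible/tmp"  # remote_tmp base seen in the log
name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2 ** 48))
tmpdir = "%s/%s" % (remote_tmp, name)

wrapper = '( umask 77 && mkdir -p "%s" && mkdir "%s" && echo %s="%s" ) && sleep 0' % (
    remote_tmp, tmpdir, name, tmpdir,
)
print(wrapper)  # the controller parses the echoed name=path line from stdout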
27885 1726882534.17625: running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager 27885 1726882534.17706: in run() - task 12673a56-9f93-3fa5-01be-00000000016f 27885 1726882534.17836: variable 'ansible_search_path' from source: unknown 27885 1726882534.17840: variable 'ansible_search_path' from source: unknown 27885 1726882534.17875: calling self._execute() 27885 1726882534.18097: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882534.18105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882534.18114: variable 'omit' from source: magic vars 27885 1726882534.18798: variable 'ansible_distribution_major_version' from source: facts 27885 1726882534.19017: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882534.19183: variable 'type' from source: set_fact 27885 1726882534.19196: variable 'state' from source: include params 27885 1726882534.19242: Evaluated conditional (type == 'veth' and state == 'present'): True 27885 1726882534.19254: variable 'omit' from source: magic vars 27885 1726882534.19296: variable 'omit' from source: magic vars 27885 1726882534.19438: variable 'interface' from source: set_fact 27885 1726882534.19579: variable 'omit' from source: magic vars 27885 1726882534.19779: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882534.19783: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882534.19805: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882534.19828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882534.19903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882534.19939: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882534.19949: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882534.20005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882534.20212: Set connection var ansible_pipelining to False 27885 1726882534.20229: Set connection var ansible_connection to ssh 27885 1726882534.20234: Set connection var ansible_timeout to 10 27885 1726882534.20241: Set connection var ansible_shell_type to sh 27885 1726882534.20251: Set connection var ansible_shell_executable to /bin/sh 27885 1726882534.20436: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882534.20439: variable 'ansible_shell_executable' from source: unknown 27885 1726882534.20442: variable 'ansible_connection' from source: unknown 27885 1726882534.20445: variable 'ansible_module_compression' from source: unknown 27885 1726882534.20448: variable 'ansible_shell_type' from source: unknown 27885 1726882534.20450: variable 'ansible_shell_executable' from source: unknown 27885 1726882534.20453: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882534.20455: variable 'ansible_pipelining' from source: unknown 27885 1726882534.20458: variable 'ansible_timeout' from source: unknown 27885 1726882534.20460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882534.20694: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882534.20976: variable 'omit' from source: magic vars 27885 1726882534.20979: starting attempt loop 27885 1726882534.20981: running the handler 27885 1726882534.20984: _low_level_execute_command(): starting 27885 1726882534.20986: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882534.22314: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882534.22435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882534.22495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882534.24057: stdout chunk (state=3): >>>/root <<< 27885 1726882534.24202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882534.24214: stdout chunk (state=3): >>><<< 27885 1726882534.24226: stderr chunk (state=3): >>><<< 27885 1726882534.24463: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882534.24468: _low_level_execute_command(): starting 27885 1726882534.24471: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir 
-p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882534.243688-28213-64764310272296 `" && echo ansible-tmp-1726882534.243688-28213-64764310272296="` echo /root/.ansible/tmp/ansible-tmp-1726882534.243688-28213-64764310272296 `" ) && sleep 0' 27885 1726882534.25947: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882534.25951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882534.25960: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882534.25962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882534.25964: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882534.26453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882534.26667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882534.28339: stdout chunk (state=3): >>>ansible-tmp-1726882534.243688-28213-64764310272296=/root/.ansible/tmp/ansible-tmp-1726882534.243688-28213-64764310272296 <<< 27885 1726882534.28438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882534.28477: stderr chunk (state=3): >>><<< 27885 1726882534.28508: stdout chunk (state=3): >>><<< 27885 1726882534.28531: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882534.243688-28213-64764310272296=/root/.ansible/tmp/ansible-tmp-1726882534.243688-28213-64764310272296 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 
1726882534.28563: variable 'ansible_module_compression' from source: unknown 27885 1726882534.28730: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882534.28765: variable 'ansible_facts' from source: unknown 27885 1726882534.28996: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882534.243688-28213-64764310272296/AnsiballZ_command.py 27885 1726882534.29281: Sending initial data 27885 1726882534.29286: Sent initial data (154 bytes) 27885 1726882534.30581: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882534.30697: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882534.30724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882534.30849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882534.32362: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882534.32451: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27885 1726882534.32521: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpjn6vto6n /root/.ansible/tmp/ansible-tmp-1726882534.243688-28213-64764310272296/AnsiballZ_command.py <<< 27885 1726882534.32539: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882534.243688-28213-64764310272296/AnsiballZ_command.py" <<< 27885 1726882534.32580: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpjn6vto6n" to remote "/root/.ansible/tmp/ansible-tmp-1726882534.243688-28213-64764310272296/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882534.243688-28213-64764310272296/AnsiballZ_command.py" <<< 27885 1726882534.33429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882534.33437: stdout chunk (state=3): >>><<< 27885 1726882534.33455: stderr chunk (state=3): >>><<< 27885 1726882534.33489: done transferring module to remote 27885 1726882534.33512: _low_level_execute_command(): starting 27885 1726882534.33520: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882534.243688-28213-64764310272296/ /root/.ansible/tmp/ansible-tmp-1726882534.243688-28213-64764310272296/AnsiballZ_command.py && sleep 0' 27885 1726882534.34131: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882534.34144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882534.34157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882534.34171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882534.34187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882534.34200: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882534.34213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882534.34230: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27885 1726882534.34244: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882534.34334: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882534.34360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882534.34614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882534.36195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882534.36206: stdout chunk (state=3): >>><<< 27885 1726882534.36217: stderr chunk (state=3): >>><<< 27885 1726882534.36234: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882534.36243: _low_level_execute_command(): starting 27885 1726882534.36252: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882534.243688-28213-64764310272296/AnsiballZ_command.py && sleep 0' 27885 1726882534.36816: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882534.36832: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882534.36846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882534.36864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882534.36880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882534.36898: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882534.36914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882534.37000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882534.37018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882534.37034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882534.37308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882534.54142: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 21:35:34.521642", "end": "2024-09-20 21:35:34.539092", "delta": "0:00:00.017450", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": 
true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27885 1726882534.55724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882534.55728: stdout chunk (state=3): >>><<< 27885 1726882534.55730: stderr chunk (state=3): >>><<< 27885 1726882534.55747: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 21:35:34.521642", "end": "2024-09-20 21:35:34.539092", "delta": "0:00:00.017450", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
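
The exchange above is one complete AnsiballZ round trip for a command task: ansible-playbook creates a remote temporary directory, uploads AnsiballZ_command.py over the reused SSH control master at /root/.ansible/cp/6f323b04b0, marks it executable, runs it with /usr/bin/python3.12, reads the module's JSON result from stdout, and then removes the temporary directory (the rm -f -r that follows). The task file itself is not reproduced in this log; the sketch below is a minimal reconstruction from the argv in the module result, and the changed_when line is an assumption inferred from the raw result reporting "changed": true while the final task status a few lines further on reports "changed": false.

  - name: Set up veth as managed by NetworkManager
    # argv taken verbatim from the module result; {{ interface }} renders to ethtest0 here
    command: nmcli d set {{ interface }} managed true
    # assumption: something suppresses the changed status, e.g. changed_when: false
    changed_when: false
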
27885 1726882534.55794: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882534.243688-28213-64764310272296/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882534.55980: _low_level_execute_command(): starting 27885 1726882534.55984: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882534.243688-28213-64764310272296/ > /dev/null 2>&1 && sleep 0' 27885 1726882534.56560: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882534.56575: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882534.56587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882534.56611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882534.56628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882534.56639: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882534.56651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882534.56712: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882534.56754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882534.56778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882534.56801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882534.56896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882534.58855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882534.58858: stdout chunk (state=3): >>><<< 27885 1726882534.58860: stderr chunk (state=3): >>><<< 27885 1726882534.58902: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882534.58905: handler run complete 27885 1726882534.58920: Evaluated conditional (False): False 27885 1726882534.59099: attempt loop complete, returning result 27885 1726882534.59102: _execute() done 27885 1726882534.59104: dumping result to json 27885 1726882534.59106: done dumping result, returning 27885 1726882534.59108: done running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager [12673a56-9f93-3fa5-01be-00000000016f] 27885 1726882534.59110: sending task result for task 12673a56-9f93-3fa5-01be-00000000016f 27885 1726882534.59178: done sending task result for task 12673a56-9f93-3fa5-01be-00000000016f 27885 1726882534.59181: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.017450", "end": "2024-09-20 21:35:34.539092", "rc": 0, "start": "2024-09-20 21:35:34.521642" } 27885 1726882534.59266: no more pending results, returning what we have 27885 1726882534.59270: results queue empty 27885 1726882534.59270: checking for any_errors_fatal 27885 1726882534.59280: done checking for any_errors_fatal 27885 1726882534.59280: checking for max_fail_percentage 27885 1726882534.59283: done checking for max_fail_percentage 27885 1726882534.59284: checking to see if all hosts have failed and the running result is not ok 27885 1726882534.59285: done checking to see if all hosts have failed 27885 1726882534.59285: getting the remaining hosts for this loop 27885 1726882534.59287: done getting the remaining hosts for this loop 27885 1726882534.59291: getting the next task for host managed_node2 27885 1726882534.59299: done getting next task for host managed_node2 27885 1726882534.59301: ^ task is: TASK: Delete veth interface {{ interface }} 27885 1726882534.59304: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882534.59312: getting variables 27885 1726882534.59314: in VariableManager get_vars() 27885 1726882534.59353: Calling all_inventory to load vars for managed_node2 27885 1726882534.59355: Calling groups_inventory to load vars for managed_node2 27885 1726882534.59357: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882534.59367: Calling all_plugins_play to load vars for managed_node2 27885 1726882534.59370: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882534.59372: Calling groups_plugins_play to load vars for managed_node2 27885 1726882534.59774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882534.60160: done with get_vars() 27885 1726882534.60171: done getting variables 27885 1726882534.60334: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882534.60460: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:35:34 -0400 (0:00:00.439) 0:00:07.247 ****** 27885 1726882534.60491: entering _queue_task() for managed_node2/command 27885 1726882534.60802: worker is 1 (out of 1 available) 27885 1726882534.60814: exiting _queue_task() for managed_node2/command 27885 1726882534.60943: done queuing things up, now waiting for results queue to drain 27885 1726882534.60945: waiting for pending results... 
27885 1726882534.61168: running TaskExecutor() for managed_node2/TASK: Delete veth interface ethtest0 27885 1726882534.61233: in run() - task 12673a56-9f93-3fa5-01be-000000000170 27885 1726882534.61253: variable 'ansible_search_path' from source: unknown 27885 1726882534.61271: variable 'ansible_search_path' from source: unknown 27885 1726882534.61314: calling self._execute() 27885 1726882534.61414: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882534.61484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882534.61487: variable 'omit' from source: magic vars 27885 1726882534.61798: variable 'ansible_distribution_major_version' from source: facts 27885 1726882534.61824: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882534.62029: variable 'type' from source: set_fact 27885 1726882534.62041: variable 'state' from source: include params 27885 1726882534.62051: variable 'interface' from source: set_fact 27885 1726882534.62060: variable 'current_interfaces' from source: set_fact 27885 1726882534.62074: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 27885 1726882534.62084: when evaluation is False, skipping this task 27885 1726882534.62092: _execute() done 27885 1726882534.62138: dumping result to json 27885 1726882534.62141: done dumping result, returning 27885 1726882534.62144: done running TaskExecutor() for managed_node2/TASK: Delete veth interface ethtest0 [12673a56-9f93-3fa5-01be-000000000170] 27885 1726882534.62147: sending task result for task 12673a56-9f93-3fa5-01be-000000000170 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 27885 1726882534.62285: no more pending results, returning what we have 27885 1726882534.62290: results queue empty 27885 1726882534.62291: checking for any_errors_fatal 27885 1726882534.62302: done checking for any_errors_fatal 27885 1726882534.62303: checking for max_fail_percentage 27885 1726882534.62304: done checking for max_fail_percentage 27885 1726882534.62305: checking to see if all hosts have failed and the running result is not ok 27885 1726882534.62306: done checking to see if all hosts have failed 27885 1726882534.62307: getting the remaining hosts for this loop 27885 1726882534.62308: done getting the remaining hosts for this loop 27885 1726882534.62312: getting the next task for host managed_node2 27885 1726882534.62319: done getting next task for host managed_node2 27885 1726882534.62321: ^ task is: TASK: Create dummy interface {{ interface }} 27885 1726882534.62326: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882534.62330: getting variables 27885 1726882534.62332: in VariableManager get_vars() 27885 1726882534.62372: Calling all_inventory to load vars for managed_node2 27885 1726882534.62376: Calling groups_inventory to load vars for managed_node2 27885 1726882534.62378: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882534.62391: Calling all_plugins_play to load vars for managed_node2 27885 1726882534.62611: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882534.62617: Calling groups_plugins_play to load vars for managed_node2 27885 1726882534.62850: done sending task result for task 12673a56-9f93-3fa5-01be-000000000170 27885 1726882534.62854: WORKER PROCESS EXITING 27885 1726882534.62876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882534.63097: done with get_vars() 27885 1726882534.63106: done getting variables 27885 1726882534.63168: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882534.63274: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:35:34 -0400 (0:00:00.028) 0:00:07.275 ****** 27885 1726882534.63303: entering _queue_task() for managed_node2/command 27885 1726882534.63542: worker is 1 (out of 1 available) 27885 1726882534.63554: exiting _queue_task() for managed_node2/command 27885 1726882534.63566: done queuing things up, now waiting for results queue to drain 27885 1726882534.63568: waiting for pending results... 
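
The skipped tasks before and after this point all follow the same guard pattern in manage_test_interface.yml: each create/delete task is templated on {{ interface }} and gated by a when: expression over type, state and current_interfaces, so only the variant matching the requested interface type and state actually runs. The when: expressions below are quoted verbatim from the false_condition fields in this log; the command bodies are illustrative only, since the file contents themselves are not shown here.

  - name: "Delete veth interface {{ interface }}"      # manage_test_interface.yml:43
    command: ip link del {{ interface }}               # illustrative command, not from the log
    when: type == 'veth' and state == 'absent' and interface in current_interfaces

  - name: "Create dummy interface {{ interface }}"     # manage_test_interface.yml:49
    command: ip link add {{ interface }} type dummy    # illustrative command, not from the log
    when: type == 'dummy' and state == 'present' and interface not in current_interfaces
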
27885 1726882534.63833: running TaskExecutor() for managed_node2/TASK: Create dummy interface ethtest0 27885 1726882534.63938: in run() - task 12673a56-9f93-3fa5-01be-000000000171 27885 1726882534.63959: variable 'ansible_search_path' from source: unknown 27885 1726882534.63967: variable 'ansible_search_path' from source: unknown 27885 1726882534.64006: calling self._execute() 27885 1726882534.64400: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882534.64405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882534.64408: variable 'omit' from source: magic vars 27885 1726882534.64998: variable 'ansible_distribution_major_version' from source: facts 27885 1726882534.65014: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882534.65355: variable 'type' from source: set_fact 27885 1726882534.65598: variable 'state' from source: include params 27885 1726882534.65603: variable 'interface' from source: set_fact 27885 1726882534.65606: variable 'current_interfaces' from source: set_fact 27885 1726882534.65609: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 27885 1726882534.65611: when evaluation is False, skipping this task 27885 1726882534.65613: _execute() done 27885 1726882534.65615: dumping result to json 27885 1726882534.65617: done dumping result, returning 27885 1726882534.65619: done running TaskExecutor() for managed_node2/TASK: Create dummy interface ethtest0 [12673a56-9f93-3fa5-01be-000000000171] 27885 1726882534.65621: sending task result for task 12673a56-9f93-3fa5-01be-000000000171 27885 1726882534.65681: done sending task result for task 12673a56-9f93-3fa5-01be-000000000171 27885 1726882534.65684: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 27885 1726882534.65733: no more pending results, returning what we have 27885 1726882534.65737: results queue empty 27885 1726882534.65738: checking for any_errors_fatal 27885 1726882534.65743: done checking for any_errors_fatal 27885 1726882534.65744: checking for max_fail_percentage 27885 1726882534.65745: done checking for max_fail_percentage 27885 1726882534.65746: checking to see if all hosts have failed and the running result is not ok 27885 1726882534.65747: done checking to see if all hosts have failed 27885 1726882534.65748: getting the remaining hosts for this loop 27885 1726882534.65749: done getting the remaining hosts for this loop 27885 1726882534.65753: getting the next task for host managed_node2 27885 1726882534.65760: done getting next task for host managed_node2 27885 1726882534.65762: ^ task is: TASK: Delete dummy interface {{ interface }} 27885 1726882534.65766: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882534.65770: getting variables 27885 1726882534.65772: in VariableManager get_vars() 27885 1726882534.65814: Calling all_inventory to load vars for managed_node2 27885 1726882534.65817: Calling groups_inventory to load vars for managed_node2 27885 1726882534.65820: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882534.65832: Calling all_plugins_play to load vars for managed_node2 27885 1726882534.65835: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882534.65838: Calling groups_plugins_play to load vars for managed_node2 27885 1726882534.66370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882534.66619: done with get_vars() 27885 1726882534.66629: done getting variables 27885 1726882534.66695: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882534.66806: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:35:34 -0400 (0:00:00.035) 0:00:07.310 ****** 27885 1726882534.66834: entering _queue_task() for managed_node2/command 27885 1726882534.67081: worker is 1 (out of 1 available) 27885 1726882534.67209: exiting _queue_task() for managed_node2/command 27885 1726882534.67225: done queuing things up, now waiting for results queue to drain 27885 1726882534.67226: waiting for pending results... 
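
Every task in this stretch evaluates ansible_distribution_major_version != '6' (True on this host) before its own when: condition is considered. The log does not show where that guard is defined; one common way a fact-based condition ends up inherited by every included task is a block-level when: in the calling playbook, sketched here purely as an assumption.

  # Hypothetical structure; the guard could equally be attached per task or on the includes themselves.
  - block:
      - include_tasks: tasks/manage_test_interface.yml
      - include_tasks: tasks/assert_device_present.yml
    when: ansible_distribution_major_version != '6'
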
27885 1726882534.67511: running TaskExecutor() for managed_node2/TASK: Delete dummy interface ethtest0 27885 1726882534.67515: in run() - task 12673a56-9f93-3fa5-01be-000000000172 27885 1726882534.67519: variable 'ansible_search_path' from source: unknown 27885 1726882534.67600: variable 'ansible_search_path' from source: unknown 27885 1726882534.67605: calling self._execute() 27885 1726882534.67663: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882534.67675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882534.67690: variable 'omit' from source: magic vars 27885 1726882534.68080: variable 'ansible_distribution_major_version' from source: facts 27885 1726882534.68084: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882534.68308: variable 'type' from source: set_fact 27885 1726882534.68374: variable 'state' from source: include params 27885 1726882534.68377: variable 'interface' from source: set_fact 27885 1726882534.68380: variable 'current_interfaces' from source: set_fact 27885 1726882534.68382: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 27885 1726882534.68385: when evaluation is False, skipping this task 27885 1726882534.68387: _execute() done 27885 1726882534.68389: dumping result to json 27885 1726882534.68391: done dumping result, returning 27885 1726882534.68395: done running TaskExecutor() for managed_node2/TASK: Delete dummy interface ethtest0 [12673a56-9f93-3fa5-01be-000000000172] 27885 1726882534.68397: sending task result for task 12673a56-9f93-3fa5-01be-000000000172 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 27885 1726882534.68614: no more pending results, returning what we have 27885 1726882534.68625: results queue empty 27885 1726882534.68626: checking for any_errors_fatal 27885 1726882534.68633: done checking for any_errors_fatal 27885 1726882534.68633: checking for max_fail_percentage 27885 1726882534.68635: done checking for max_fail_percentage 27885 1726882534.68636: checking to see if all hosts have failed and the running result is not ok 27885 1726882534.68637: done checking to see if all hosts have failed 27885 1726882534.68638: getting the remaining hosts for this loop 27885 1726882534.68640: done getting the remaining hosts for this loop 27885 1726882534.68643: getting the next task for host managed_node2 27885 1726882534.68650: done getting next task for host managed_node2 27885 1726882534.68652: ^ task is: TASK: Create tap interface {{ interface }} 27885 1726882534.68656: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882534.68660: getting variables 27885 1726882534.68662: in VariableManager get_vars() 27885 1726882534.68704: Calling all_inventory to load vars for managed_node2 27885 1726882534.68707: Calling groups_inventory to load vars for managed_node2 27885 1726882534.68710: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882534.68722: Calling all_plugins_play to load vars for managed_node2 27885 1726882534.68842: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882534.68849: done sending task result for task 12673a56-9f93-3fa5-01be-000000000172 27885 1726882534.68851: WORKER PROCESS EXITING 27885 1726882534.68856: Calling groups_plugins_play to load vars for managed_node2 27885 1726882534.69366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882534.69584: done with get_vars() 27885 1726882534.69594: done getting variables 27885 1726882534.69659: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882534.69760: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:35:34 -0400 (0:00:00.029) 0:00:07.340 ****** 27885 1726882534.69782: entering _queue_task() for managed_node2/command 27885 1726882534.70348: worker is 1 (out of 1 available) 27885 1726882534.70359: exiting _queue_task() for managed_node2/command 27885 1726882534.70369: done queuing things up, now waiting for results queue to drain 27885 1726882534.70370: waiting for pending results... 
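
The variables behind these guards come from different places, and the debug lines name the source of each: state is supplied by the include that pulled in manage_test_interface.yml ("include params"), while type, interface and current_interfaces were established earlier with set_fact. A minimal sketch of that wiring follows; the values are inferred from the veth setup above, and the exact mechanism for handing state to the included file is not visible here, so the vars: form is shown only as an illustration.

  - name: Set test interface facts              # illustrative; the real set_fact task is not shown in this log
    set_fact:
      interface: ethtest0
      type: veth

  - name: Manage the test interface             # illustrative; one way to hand state to the included file
    include_tasks: tasks/manage_test_interface.yml
    vars:
      state: present
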
27885 1726882534.70558: running TaskExecutor() for managed_node2/TASK: Create tap interface ethtest0 27885 1726882534.70780: in run() - task 12673a56-9f93-3fa5-01be-000000000173 27885 1726882534.70998: variable 'ansible_search_path' from source: unknown 27885 1726882534.71001: variable 'ansible_search_path' from source: unknown 27885 1726882534.71004: calling self._execute() 27885 1726882534.71133: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882534.71152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882534.71166: variable 'omit' from source: magic vars 27885 1726882534.71854: variable 'ansible_distribution_major_version' from source: facts 27885 1726882534.71987: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882534.72418: variable 'type' from source: set_fact 27885 1726882534.72427: variable 'state' from source: include params 27885 1726882534.72434: variable 'interface' from source: set_fact 27885 1726882534.72516: variable 'current_interfaces' from source: set_fact 27885 1726882534.72521: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 27885 1726882534.72523: when evaluation is False, skipping this task 27885 1726882534.72526: _execute() done 27885 1726882534.72528: dumping result to json 27885 1726882534.72530: done dumping result, returning 27885 1726882534.72532: done running TaskExecutor() for managed_node2/TASK: Create tap interface ethtest0 [12673a56-9f93-3fa5-01be-000000000173] 27885 1726882534.72535: sending task result for task 12673a56-9f93-3fa5-01be-000000000173 27885 1726882534.72614: done sending task result for task 12673a56-9f93-3fa5-01be-000000000173 27885 1726882534.72617: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 27885 1726882534.72946: no more pending results, returning what we have 27885 1726882534.72949: results queue empty 27885 1726882534.72950: checking for any_errors_fatal 27885 1726882534.72958: done checking for any_errors_fatal 27885 1726882534.72959: checking for max_fail_percentage 27885 1726882534.72961: done checking for max_fail_percentage 27885 1726882534.72961: checking to see if all hosts have failed and the running result is not ok 27885 1726882534.72962: done checking to see if all hosts have failed 27885 1726882534.72963: getting the remaining hosts for this loop 27885 1726882534.72964: done getting the remaining hosts for this loop 27885 1726882534.72968: getting the next task for host managed_node2 27885 1726882534.72975: done getting next task for host managed_node2 27885 1726882534.72977: ^ task is: TASK: Delete tap interface {{ interface }} 27885 1726882534.72982: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882534.72987: getting variables 27885 1726882534.72988: in VariableManager get_vars() 27885 1726882534.73035: Calling all_inventory to load vars for managed_node2 27885 1726882534.73038: Calling groups_inventory to load vars for managed_node2 27885 1726882534.73041: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882534.73055: Calling all_plugins_play to load vars for managed_node2 27885 1726882534.73058: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882534.73061: Calling groups_plugins_play to load vars for managed_node2 27885 1726882534.73587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882534.74207: done with get_vars() 27885 1726882534.74218: done getting variables 27885 1726882534.74277: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882534.74388: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:35:34 -0400 (0:00:00.048) 0:00:07.388 ****** 27885 1726882534.74614: entering _queue_task() for managed_node2/command 27885 1726882534.74874: worker is 1 (out of 1 available) 27885 1726882534.74885: exiting _queue_task() for managed_node2/command 27885 1726882534.74899: done queuing things up, now waiting for results queue to drain 27885 1726882534.74900: waiting for pending results... 
27885 1726882534.75160: running TaskExecutor() for managed_node2/TASK: Delete tap interface ethtest0 27885 1726882534.75265: in run() - task 12673a56-9f93-3fa5-01be-000000000174 27885 1726882534.75499: variable 'ansible_search_path' from source: unknown 27885 1726882534.75502: variable 'ansible_search_path' from source: unknown 27885 1726882534.75505: calling self._execute() 27885 1726882534.75507: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882534.75510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882534.75512: variable 'omit' from source: magic vars 27885 1726882534.75798: variable 'ansible_distribution_major_version' from source: facts 27885 1726882534.75816: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882534.76103: variable 'type' from source: set_fact 27885 1726882534.76113: variable 'state' from source: include params 27885 1726882534.76121: variable 'interface' from source: set_fact 27885 1726882534.76128: variable 'current_interfaces' from source: set_fact 27885 1726882534.76139: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 27885 1726882534.76145: when evaluation is False, skipping this task 27885 1726882534.76151: _execute() done 27885 1726882534.76158: dumping result to json 27885 1726882534.76167: done dumping result, returning 27885 1726882534.76179: done running TaskExecutor() for managed_node2/TASK: Delete tap interface ethtest0 [12673a56-9f93-3fa5-01be-000000000174] 27885 1726882534.76187: sending task result for task 12673a56-9f93-3fa5-01be-000000000174 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 27885 1726882534.76325: no more pending results, returning what we have 27885 1726882534.76329: results queue empty 27885 1726882534.76330: checking for any_errors_fatal 27885 1726882534.76338: done checking for any_errors_fatal 27885 1726882534.76339: checking for max_fail_percentage 27885 1726882534.76341: done checking for max_fail_percentage 27885 1726882534.76342: checking to see if all hosts have failed and the running result is not ok 27885 1726882534.76343: done checking to see if all hosts have failed 27885 1726882534.76344: getting the remaining hosts for this loop 27885 1726882534.76345: done getting the remaining hosts for this loop 27885 1726882534.76349: getting the next task for host managed_node2 27885 1726882534.76357: done getting next task for host managed_node2 27885 1726882534.76361: ^ task is: TASK: Assert device is present 27885 1726882534.76364: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882534.76368: getting variables 27885 1726882534.76370: in VariableManager get_vars() 27885 1726882534.76413: Calling all_inventory to load vars for managed_node2 27885 1726882534.76417: Calling groups_inventory to load vars for managed_node2 27885 1726882534.76420: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882534.76433: Calling all_plugins_play to load vars for managed_node2 27885 1726882534.76436: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882534.76439: Calling groups_plugins_play to load vars for managed_node2 27885 1726882534.76910: done sending task result for task 12673a56-9f93-3fa5-01be-000000000174 27885 1726882534.76913: WORKER PROCESS EXITING 27885 1726882534.76936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882534.77160: done with get_vars() 27885 1726882534.77170: done getting variables TASK [Assert device is present] ************************************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:21 Friday 20 September 2024 21:35:34 -0400 (0:00:00.026) 0:00:07.415 ****** 27885 1726882534.77255: entering _queue_task() for managed_node2/include_tasks 27885 1726882534.77498: worker is 1 (out of 1 available) 27885 1726882534.77510: exiting _queue_task() for managed_node2/include_tasks 27885 1726882534.77521: done queuing things up, now waiting for results queue to drain 27885 1726882534.77522: waiting for pending results... 27885 1726882534.77766: running TaskExecutor() for managed_node2/TASK: Assert device is present 27885 1726882534.77860: in run() - task 12673a56-9f93-3fa5-01be-00000000000e 27885 1726882534.77884: variable 'ansible_search_path' from source: unknown 27885 1726882534.77924: calling self._execute() 27885 1726882534.78018: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882534.78031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882534.78044: variable 'omit' from source: magic vars 27885 1726882534.78377: variable 'ansible_distribution_major_version' from source: facts 27885 1726882534.78394: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882534.78406: _execute() done 27885 1726882534.78415: dumping result to json 27885 1726882534.78427: done dumping result, returning 27885 1726882534.78438: done running TaskExecutor() for managed_node2/TASK: Assert device is present [12673a56-9f93-3fa5-01be-00000000000e] 27885 1726882534.78448: sending task result for task 12673a56-9f93-3fa5-01be-00000000000e 27885 1726882534.78620: no more pending results, returning what we have 27885 1726882534.78625: in VariableManager get_vars() 27885 1726882534.78668: Calling all_inventory to load vars for managed_node2 27885 1726882534.78671: Calling groups_inventory to load vars for managed_node2 27885 1726882534.78673: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882534.78685: Calling all_plugins_play to load vars for managed_node2 27885 1726882534.78688: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882534.78691: Calling groups_plugins_play to load vars for managed_node2 27885 1726882534.79284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882534.79589: done with get_vars() 27885 1726882534.79602: variable 
'ansible_search_path' from source: unknown 27885 1726882534.79699: done sending task result for task 12673a56-9f93-3fa5-01be-00000000000e 27885 1726882534.79702: WORKER PROCESS EXITING 27885 1726882534.79709: we have included files to process 27885 1726882534.79710: generating all_blocks data 27885 1726882534.79711: done generating all_blocks data 27885 1726882534.79716: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 27885 1726882534.79717: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 27885 1726882534.79720: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 27885 1726882534.80071: in VariableManager get_vars() 27885 1726882534.80092: done with get_vars() 27885 1726882534.80328: done processing included file 27885 1726882534.80330: iterating over new_blocks loaded from include file 27885 1726882534.80331: in VariableManager get_vars() 27885 1726882534.80347: done with get_vars() 27885 1726882534.80349: filtering new block on tags 27885 1726882534.80365: done filtering new block on tags 27885 1726882534.80367: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 27885 1726882534.80372: extending task lists for all hosts with included blocks 27885 1726882534.81566: done extending task lists 27885 1726882534.81568: done processing included files 27885 1726882534.81568: results queue empty 27885 1726882534.81569: checking for any_errors_fatal 27885 1726882534.81572: done checking for any_errors_fatal 27885 1726882534.81572: checking for max_fail_percentage 27885 1726882534.81573: done checking for max_fail_percentage 27885 1726882534.81574: checking to see if all hosts have failed and the running result is not ok 27885 1726882534.81575: done checking to see if all hosts have failed 27885 1726882534.81575: getting the remaining hosts for this loop 27885 1726882534.81576: done getting the remaining hosts for this loop 27885 1726882534.81579: getting the next task for host managed_node2 27885 1726882534.81582: done getting next task for host managed_node2 27885 1726882534.81584: ^ task is: TASK: Include the task 'get_interface_stat.yml' 27885 1726882534.81586: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882534.81589: getting variables 27885 1726882534.81590: in VariableManager get_vars() 27885 1726882534.81604: Calling all_inventory to load vars for managed_node2 27885 1726882534.81606: Calling groups_inventory to load vars for managed_node2 27885 1726882534.81608: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882534.81612: Calling all_plugins_play to load vars for managed_node2 27885 1726882534.81614: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882534.81617: Calling groups_plugins_play to load vars for managed_node2 27885 1726882534.81753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882534.82466: done with get_vars() 27885 1726882534.82475: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:35:34 -0400 (0:00:00.055) 0:00:07.471 ****** 27885 1726882534.82857: entering _queue_task() for managed_node2/include_tasks 27885 1726882534.83245: worker is 1 (out of 1 available) 27885 1726882534.83257: exiting _queue_task() for managed_node2/include_tasks 27885 1726882534.83269: done queuing things up, now waiting for results queue to drain 27885 1726882534.83271: waiting for pending results... 27885 1726882534.83922: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 27885 1726882534.84031: in run() - task 12673a56-9f93-3fa5-01be-000000000214 27885 1726882534.84166: variable 'ansible_search_path' from source: unknown 27885 1726882534.84175: variable 'ansible_search_path' from source: unknown 27885 1726882534.84218: calling self._execute() 27885 1726882534.84335: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882534.84425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882534.84441: variable 'omit' from source: magic vars 27885 1726882534.84797: variable 'ansible_distribution_major_version' from source: facts 27885 1726882534.84926: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882534.84938: _execute() done 27885 1726882534.85085: dumping result to json 27885 1726882534.85099: done dumping result, returning 27885 1726882534.85110: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-3fa5-01be-000000000214] 27885 1726882534.85120: sending task result for task 12673a56-9f93-3fa5-01be-000000000214 27885 1726882534.85436: no more pending results, returning what we have 27885 1726882534.85441: in VariableManager get_vars() 27885 1726882534.85486: Calling all_inventory to load vars for managed_node2 27885 1726882534.85489: Calling groups_inventory to load vars for managed_node2 27885 1726882534.85492: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882534.85508: Calling all_plugins_play to load vars for managed_node2 27885 1726882534.85512: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882534.85515: Calling groups_plugins_play to load vars for managed_node2 27885 1726882534.86318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882534.86529: done with get_vars() 27885 1726882534.86538: variable 'ansible_search_path' from source: unknown 
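
The device assertion is layered as two small includes: tests_route_device.yml:21 runs assert_device_present.yml, whose first task (line 3) pulls in get_interface_stat.yml, and the resulting stat is then asserted on. Only the task names, file paths and the stat action are visible here and just below; the bodies that follow are a hedged reconstruction in which the /sys/class/net path, the interface_stat register name and the assert wording are assumptions.

  # get_interface_stat.yml (sketch)
  - name: "Get stat for interface {{ interface }}"
    stat:
      path: "/sys/class/net/{{ interface }}"    # assumed path
    register: interface_stat                     # assumed register name

  # assert_device_present.yml (sketch)
  - name: Include the task 'get_interface_stat.yml'
    include_tasks: get_interface_stat.yml

  - name: Assert that the device is present      # assumed task name
    assert:
      that: interface_stat.stat.exists           # assumed assertion
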
27885 1726882534.86539: variable 'ansible_search_path' from source: unknown 27885 1726882534.86822: we have included files to process 27885 1726882534.86824: generating all_blocks data 27885 1726882534.86825: done generating all_blocks data 27885 1726882534.86827: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27885 1726882534.86828: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27885 1726882534.86830: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27885 1726882534.87206: done sending task result for task 12673a56-9f93-3fa5-01be-000000000214 27885 1726882534.87210: WORKER PROCESS EXITING 27885 1726882534.87273: done processing included file 27885 1726882534.87275: iterating over new_blocks loaded from include file 27885 1726882534.87277: in VariableManager get_vars() 27885 1726882534.87296: done with get_vars() 27885 1726882534.87298: filtering new block on tags 27885 1726882534.87313: done filtering new block on tags 27885 1726882534.87315: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 27885 1726882534.87319: extending task lists for all hosts with included blocks 27885 1726882534.87555: done extending task lists 27885 1726882534.87557: done processing included files 27885 1726882534.87558: results queue empty 27885 1726882534.87558: checking for any_errors_fatal 27885 1726882534.87562: done checking for any_errors_fatal 27885 1726882534.87562: checking for max_fail_percentage 27885 1726882534.87564: done checking for max_fail_percentage 27885 1726882534.87564: checking to see if all hosts have failed and the running result is not ok 27885 1726882534.87565: done checking to see if all hosts have failed 27885 1726882534.87566: getting the remaining hosts for this loop 27885 1726882534.87567: done getting the remaining hosts for this loop 27885 1726882534.87569: getting the next task for host managed_node2 27885 1726882534.87573: done getting next task for host managed_node2 27885 1726882534.87575: ^ task is: TASK: Get stat for interface {{ interface }} 27885 1726882534.87579: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882534.87582: getting variables 27885 1726882534.87582: in VariableManager get_vars() 27885 1726882534.87596: Calling all_inventory to load vars for managed_node2 27885 1726882534.87598: Calling groups_inventory to load vars for managed_node2 27885 1726882534.87601: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882534.87605: Calling all_plugins_play to load vars for managed_node2 27885 1726882534.87607: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882534.87610: Calling groups_plugins_play to load vars for managed_node2 27885 1726882534.87791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882534.88163: done with get_vars() 27885 1726882534.88172: done getting variables 27885 1726882534.88321: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:35:34 -0400 (0:00:00.054) 0:00:07.525 ****** 27885 1726882534.88351: entering _queue_task() for managed_node2/stat 27885 1726882534.88599: worker is 1 (out of 1 available) 27885 1726882534.88609: exiting _queue_task() for managed_node2/stat 27885 1726882534.88619: done queuing things up, now waiting for results queue to drain 27885 1726882534.88620: waiting for pending results... 27885 1726882534.88850: running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 27885 1726882534.88958: in run() - task 12673a56-9f93-3fa5-01be-000000000267 27885 1726882534.88981: variable 'ansible_search_path' from source: unknown 27885 1726882534.88988: variable 'ansible_search_path' from source: unknown 27885 1726882534.89030: calling self._execute() 27885 1726882534.89118: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882534.89129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882534.89142: variable 'omit' from source: magic vars 27885 1726882534.89478: variable 'ansible_distribution_major_version' from source: facts 27885 1726882534.89497: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882534.89513: variable 'omit' from source: magic vars 27885 1726882534.89558: variable 'omit' from source: magic vars 27885 1726882534.89659: variable 'interface' from source: set_fact 27885 1726882534.89680: variable 'omit' from source: magic vars 27885 1726882534.89726: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882534.89764: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882534.89784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882534.89807: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882534.89825: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882534.89863: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882534.89873: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882534.89881: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 27885 1726882534.89980: Set connection var ansible_pipelining to False 27885 1726882534.89990: Set connection var ansible_connection to ssh 27885 1726882534.90060: Set connection var ansible_timeout to 10 27885 1726882534.90067: Set connection var ansible_shell_type to sh 27885 1726882534.90075: Set connection var ansible_shell_executable to /bin/sh 27885 1726882534.90084: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882534.90111: variable 'ansible_shell_executable' from source: unknown 27885 1726882534.90118: variable 'ansible_connection' from source: unknown 27885 1726882534.90125: variable 'ansible_module_compression' from source: unknown 27885 1726882534.90164: variable 'ansible_shell_type' from source: unknown 27885 1726882534.90171: variable 'ansible_shell_executable' from source: unknown 27885 1726882534.90177: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882534.90183: variable 'ansible_pipelining' from source: unknown 27885 1726882534.90189: variable 'ansible_timeout' from source: unknown 27885 1726882534.90199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882534.90617: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882534.90631: variable 'omit' from source: magic vars 27885 1726882534.90641: starting attempt loop 27885 1726882534.90646: running the handler 27885 1726882534.90663: _low_level_execute_command(): starting 27885 1726882534.90709: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882534.92087: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882534.92418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882534.92512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882534.94239: stdout chunk (state=3): >>>/root <<< 27885 1726882534.94286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882534.94296: stdout chunk (state=3): >>><<< 27885 1726882534.94361: stderr chunk (state=3): >>><<< 27885 1726882534.94365: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882534.94368: _low_level_execute_command(): starting 27885 1726882534.94371: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882534.9432724-28250-135989956058185 `" && echo ansible-tmp-1726882534.9432724-28250-135989956058185="` echo /root/.ansible/tmp/ansible-tmp-1726882534.9432724-28250-135989956058185 `" ) && sleep 0' 27885 1726882534.95498: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882534.95707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882534.95828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882534.95881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882534.97757: stdout chunk (state=3): >>>ansible-tmp-1726882534.9432724-28250-135989956058185=/root/.ansible/tmp/ansible-tmp-1726882534.9432724-28250-135989956058185 <<< 27885 1726882534.97859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882534.97908: stderr chunk (state=3): >>><<< 27885 1726882534.97912: stdout chunk (state=3): >>><<< 27885 1726882534.97931: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882534.9432724-28250-135989956058185=/root/.ansible/tmp/ansible-tmp-1726882534.9432724-28250-135989956058185 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882534.97979: variable 'ansible_module_compression' from source: unknown 27885 1726882534.98042: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 27885 1726882534.98098: variable 'ansible_facts' from source: unknown 27885 1726882534.98383: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882534.9432724-28250-135989956058185/AnsiballZ_stat.py 27885 1726882534.98751: Sending initial data 27885 1726882534.98754: Sent initial data (153 bytes) 27885 1726882535.00282: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882535.00286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882535.00289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882535.00561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882535.02034: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 27885 1726882535.02038: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882535.02086: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882535.02148: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpoz5uat8d /root/.ansible/tmp/ansible-tmp-1726882534.9432724-28250-135989956058185/AnsiballZ_stat.py <<< 27885 1726882535.02151: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882534.9432724-28250-135989956058185/AnsiballZ_stat.py" <<< 27885 1726882535.02207: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpoz5uat8d" to remote "/root/.ansible/tmp/ansible-tmp-1726882534.9432724-28250-135989956058185/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882534.9432724-28250-135989956058185/AnsiballZ_stat.py" <<< 27885 1726882535.03099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882535.03102: stderr chunk (state=3): >>><<< 27885 1726882535.03113: stdout chunk (state=3): >>><<< 27885 1726882535.03216: done transferring module to remote 27885 1726882535.03221: _low_level_execute_command(): starting 27885 1726882535.03223: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882534.9432724-28250-135989956058185/ /root/.ansible/tmp/ansible-tmp-1726882534.9432724-28250-135989956058185/AnsiballZ_stat.py && sleep 0' 27885 1726882535.03736: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882535.03751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882535.03769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882535.03791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882535.03886: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882535.03926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882535.04030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882535.05745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882535.05751: stderr chunk (state=3): >>><<< 27885 1726882535.05753: stdout chunk (state=3): >>><<< 27885 1726882535.05767: _low_level_execute_command() 
done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882535.05774: _low_level_execute_command(): starting 27885 1726882535.05776: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882534.9432724-28250-135989956058185/AnsiballZ_stat.py && sleep 0' 27885 1726882535.06644: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882535.06649: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882535.06652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882535.06744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882535.06753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882535.06888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882535.21688: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30009, "dev": 23, "nlink": 1, "atime": 1726882533.383787, "mtime": 1726882533.383787, "ctime": 1726882533.383787, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, 
"writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 27885 1726882535.22815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882535.22841: stderr chunk (state=3): >>><<< 27885 1726882535.22844: stdout chunk (state=3): >>><<< 27885 1726882535.22861: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30009, "dev": 23, "nlink": 1, "atime": 1726882533.383787, "mtime": 1726882533.383787, "ctime": 1726882533.383787, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
27885 1726882535.22905: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882534.9432724-28250-135989956058185/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882535.22913: _low_level_execute_command(): starting 27885 1726882535.22918: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882534.9432724-28250-135989956058185/ > /dev/null 2>&1 && sleep 0' 27885 1726882535.23339: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882535.23343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882535.23345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882535.23347: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882535.23350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882535.23399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882535.23407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882535.23466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882535.25233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882535.25259: stderr chunk (state=3): >>><<< 27885 1726882535.25262: stdout chunk (state=3): >>><<< 27885 1726882535.25273: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882535.25283: handler run complete 27885 1726882535.25314: attempt loop complete, returning result 27885 1726882535.25317: _execute() done 27885 1726882535.25319: dumping result to json 27885 1726882535.25325: done dumping result, returning 27885 1726882535.25332: done running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 [12673a56-9f93-3fa5-01be-000000000267] 27885 1726882535.25337: sending task result for task 12673a56-9f93-3fa5-01be-000000000267 27885 1726882535.25443: done sending task result for task 12673a56-9f93-3fa5-01be-000000000267 27885 1726882535.25445: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882533.383787, "block_size": 4096, "blocks": 0, "ctime": 1726882533.383787, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 30009, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1726882533.383787, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 27885 1726882535.25543: no more pending results, returning what we have 27885 1726882535.25546: results queue empty 27885 1726882535.25547: checking for any_errors_fatal 27885 1726882535.25548: done checking for any_errors_fatal 27885 1726882535.25549: checking for max_fail_percentage 27885 1726882535.25550: done checking for max_fail_percentage 27885 1726882535.25551: checking to see if all hosts have failed and the running result is not ok 27885 1726882535.25552: done checking to see if all hosts have failed 27885 1726882535.25552: getting the remaining hosts for this loop 27885 1726882535.25554: done getting the remaining hosts for this loop 27885 1726882535.25557: getting the next task for host managed_node2 27885 1726882535.25565: done getting next task for host managed_node2 27885 1726882535.25567: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 27885 1726882535.25570: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882535.25574: getting variables 27885 1726882535.25575: in VariableManager get_vars() 27885 1726882535.25612: Calling all_inventory to load vars for managed_node2 27885 1726882535.25614: Calling groups_inventory to load vars for managed_node2 27885 1726882535.25617: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882535.25626: Calling all_plugins_play to load vars for managed_node2 27885 1726882535.25628: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882535.25630: Calling groups_plugins_play to load vars for managed_node2 27885 1726882535.25767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882535.25926: done with get_vars() 27885 1726882535.25934: done getting variables 27885 1726882535.26006: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 27885 1726882535.26092: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:35:35 -0400 (0:00:00.377) 0:00:07.903 ****** 27885 1726882535.26116: entering _queue_task() for managed_node2/assert 27885 1726882535.26117: Creating lock for assert 27885 1726882535.26312: worker is 1 (out of 1 available) 27885 1726882535.26323: exiting _queue_task() for managed_node2/assert 27885 1726882535.26334: done queuing things up, now waiting for results queue to drain 27885 1726882535.26335: waiting for pending results... 
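(Editor's note, for orientation.) The stat result printed just above comes from the included file get_interface_stat.yml:3. The file itself is not reproduced in this log, but the module_args logged with the result (path /sys/class/net/ethtest0, get_attributes/get_checksum/get_mime all false) suggest a task roughly like the sketch below. The register name interface_stat is inferred from the conditional evaluated a few lines further down; treat it, and anything else not visible in the log, as an assumption.

# Hedged reconstruction of the task at get_interface_stat.yml:3, based only on
# the module_args shown in the log above; not a verbatim copy of the file.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"   # resolves to /sys/class/net/ethtest0 in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat                   # assumed name, inferred from the assert below

Because ansible_pipelining was set to False for this connection, the stat module ran through the full AnsiballZ path visible above: create a remote temp directory, upload AnsiballZ_stat.py over SFTP, chmod it, execute it with /usr/bin/python3.12, then remove the directory. The assert task queued next never touches the remote host; assert is a controller-side action plugin, which is why no _low_level_execute_command() calls appear for it below. Its only condition in this run is interface_stat.stat.exists, so the task at assert_device_present.yml:5 is presumably close to the following sketch (the msg text is illustrative, not quoted from the file):

# Hedged sketch of the assert task; only the condition is confirmed by the
# "Evaluated conditional (interface_stat.stat.exists): True" line that follows.
- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists
    msg: "Interface {{ interface }} is not present"   # assumed wording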
27885 1726882535.26487: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'ethtest0' 27885 1726882535.26548: in run() - task 12673a56-9f93-3fa5-01be-000000000215 27885 1726882535.26563: variable 'ansible_search_path' from source: unknown 27885 1726882535.26566: variable 'ansible_search_path' from source: unknown 27885 1726882535.26595: calling self._execute() 27885 1726882535.26655: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.26659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.26668: variable 'omit' from source: magic vars 27885 1726882535.26939: variable 'ansible_distribution_major_version' from source: facts 27885 1726882535.26948: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882535.26954: variable 'omit' from source: magic vars 27885 1726882535.26978: variable 'omit' from source: magic vars 27885 1726882535.27051: variable 'interface' from source: set_fact 27885 1726882535.27065: variable 'omit' from source: magic vars 27885 1726882535.27100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882535.27127: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882535.27142: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882535.27154: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882535.27165: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882535.27189: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882535.27197: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.27200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.27269: Set connection var ansible_pipelining to False 27885 1726882535.27272: Set connection var ansible_connection to ssh 27885 1726882535.27278: Set connection var ansible_timeout to 10 27885 1726882535.27280: Set connection var ansible_shell_type to sh 27885 1726882535.27285: Set connection var ansible_shell_executable to /bin/sh 27885 1726882535.27290: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882535.27315: variable 'ansible_shell_executable' from source: unknown 27885 1726882535.27319: variable 'ansible_connection' from source: unknown 27885 1726882535.27321: variable 'ansible_module_compression' from source: unknown 27885 1726882535.27323: variable 'ansible_shell_type' from source: unknown 27885 1726882535.27326: variable 'ansible_shell_executable' from source: unknown 27885 1726882535.27329: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.27332: variable 'ansible_pipelining' from source: unknown 27885 1726882535.27334: variable 'ansible_timeout' from source: unknown 27885 1726882535.27336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.27433: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 27885 1726882535.27440: variable 'omit' from source: magic vars 27885 1726882535.27446: starting attempt loop 27885 1726882535.27449: running the handler 27885 1726882535.27535: variable 'interface_stat' from source: set_fact 27885 1726882535.27549: Evaluated conditional (interface_stat.stat.exists): True 27885 1726882535.27554: handler run complete 27885 1726882535.27567: attempt loop complete, returning result 27885 1726882535.27570: _execute() done 27885 1726882535.27572: dumping result to json 27885 1726882535.27575: done dumping result, returning 27885 1726882535.27580: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'ethtest0' [12673a56-9f93-3fa5-01be-000000000215] 27885 1726882535.27585: sending task result for task 12673a56-9f93-3fa5-01be-000000000215 27885 1726882535.27666: done sending task result for task 12673a56-9f93-3fa5-01be-000000000215 27885 1726882535.27670: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 27885 1726882535.27714: no more pending results, returning what we have 27885 1726882535.27717: results queue empty 27885 1726882535.27718: checking for any_errors_fatal 27885 1726882535.27724: done checking for any_errors_fatal 27885 1726882535.27725: checking for max_fail_percentage 27885 1726882535.27726: done checking for max_fail_percentage 27885 1726882535.27727: checking to see if all hosts have failed and the running result is not ok 27885 1726882535.27728: done checking to see if all hosts have failed 27885 1726882535.27729: getting the remaining hosts for this loop 27885 1726882535.27730: done getting the remaining hosts for this loop 27885 1726882535.27733: getting the next task for host managed_node2 27885 1726882535.27739: done getting next task for host managed_node2 27885 1726882535.27742: ^ task is: TASK: Set interface1 27885 1726882535.27744: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882535.27747: getting variables 27885 1726882535.27748: in VariableManager get_vars() 27885 1726882535.27779: Calling all_inventory to load vars for managed_node2 27885 1726882535.27782: Calling groups_inventory to load vars for managed_node2 27885 1726882535.27784: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882535.27792: Calling all_plugins_play to load vars for managed_node2 27885 1726882535.27796: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882535.27799: Calling groups_plugins_play to load vars for managed_node2 27885 1726882535.27923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882535.28049: done with get_vars() 27885 1726882535.28057: done getting variables 27885 1726882535.28096: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set interface1] ********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:23 Friday 20 September 2024 21:35:35 -0400 (0:00:00.019) 0:00:07.923 ****** 27885 1726882535.28114: entering _queue_task() for managed_node2/set_fact 27885 1726882535.28282: worker is 1 (out of 1 available) 27885 1726882535.28296: exiting _queue_task() for managed_node2/set_fact 27885 1726882535.28310: done queuing things up, now waiting for results queue to drain 27885 1726882535.28311: waiting for pending results... 27885 1726882535.28449: running TaskExecutor() for managed_node2/TASK: Set interface1 27885 1726882535.28498: in run() - task 12673a56-9f93-3fa5-01be-00000000000f 27885 1726882535.28510: variable 'ansible_search_path' from source: unknown 27885 1726882535.28542: calling self._execute() 27885 1726882535.28597: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.28604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.28612: variable 'omit' from source: magic vars 27885 1726882535.28844: variable 'ansible_distribution_major_version' from source: facts 27885 1726882535.28851: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882535.28859: variable 'omit' from source: magic vars 27885 1726882535.28877: variable 'omit' from source: magic vars 27885 1726882535.28928: variable 'interface1' from source: play vars 27885 1726882535.28965: variable 'interface1' from source: play vars 27885 1726882535.28980: variable 'omit' from source: magic vars 27885 1726882535.29014: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882535.29037: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882535.29052: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882535.29064: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882535.29073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882535.29101: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882535.29104: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.29106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.29168: Set connection var ansible_pipelining to False 27885 1726882535.29171: Set connection var ansible_connection to ssh 27885 1726882535.29176: Set connection var ansible_timeout to 10 27885 1726882535.29179: Set connection var ansible_shell_type to sh 27885 1726882535.29184: Set connection var ansible_shell_executable to /bin/sh 27885 1726882535.29189: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882535.29217: variable 'ansible_shell_executable' from source: unknown 27885 1726882535.29221: variable 'ansible_connection' from source: unknown 27885 1726882535.29223: variable 'ansible_module_compression' from source: unknown 27885 1726882535.29226: variable 'ansible_shell_type' from source: unknown 27885 1726882535.29228: variable 'ansible_shell_executable' from source: unknown 27885 1726882535.29230: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.29232: variable 'ansible_pipelining' from source: unknown 27885 1726882535.29234: variable 'ansible_timeout' from source: unknown 27885 1726882535.29237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.29336: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882535.29344: variable 'omit' from source: magic vars 27885 1726882535.29349: starting attempt loop 27885 1726882535.29352: running the handler 27885 1726882535.29361: handler run complete 27885 1726882535.29368: attempt loop complete, returning result 27885 1726882535.29371: _execute() done 27885 1726882535.29373: dumping result to json 27885 1726882535.29377: done dumping result, returning 27885 1726882535.29383: done running TaskExecutor() for managed_node2/TASK: Set interface1 [12673a56-9f93-3fa5-01be-00000000000f] 27885 1726882535.29387: sending task result for task 12673a56-9f93-3fa5-01be-00000000000f 27885 1726882535.29466: done sending task result for task 12673a56-9f93-3fa5-01be-00000000000f 27885 1726882535.29469: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "interface": "ethtest1" }, "changed": false } 27885 1726882535.29519: no more pending results, returning what we have 27885 1726882535.29521: results queue empty 27885 1726882535.29522: checking for any_errors_fatal 27885 1726882535.29526: done checking for any_errors_fatal 27885 1726882535.29527: checking for max_fail_percentage 27885 1726882535.29528: done checking for max_fail_percentage 27885 1726882535.29529: checking to see if all hosts have failed and the running result is not ok 27885 1726882535.29530: done checking to see if all hosts have failed 27885 1726882535.29530: getting the remaining hosts for this loop 27885 1726882535.29531: done getting the remaining hosts for this loop 27885 1726882535.29534: getting the next task for host managed_node2 27885 1726882535.29538: done getting next task for host managed_node2 27885 1726882535.29540: ^ task is: TASK: Show interfaces 27885 1726882535.29542: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882535.29545: getting variables 27885 1726882535.29546: in VariableManager get_vars() 27885 1726882535.29575: Calling all_inventory to load vars for managed_node2 27885 1726882535.29578: Calling groups_inventory to load vars for managed_node2 27885 1726882535.29580: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882535.29587: Calling all_plugins_play to load vars for managed_node2 27885 1726882535.29589: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882535.29592: Calling groups_plugins_play to load vars for managed_node2 27885 1726882535.29736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882535.29856: done with get_vars() 27885 1726882535.29863: done getting variables TASK [Show interfaces] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:26 Friday 20 September 2024 21:35:35 -0400 (0:00:00.018) 0:00:07.941 ****** 27885 1726882535.29919: entering _queue_task() for managed_node2/include_tasks 27885 1726882535.30082: worker is 1 (out of 1 available) 27885 1726882535.30097: exiting _queue_task() for managed_node2/include_tasks 27885 1726882535.30111: done queuing things up, now waiting for results queue to drain 27885 1726882535.30112: waiting for pending results... 27885 1726882535.30250: running TaskExecutor() for managed_node2/TASK: Show interfaces 27885 1726882535.30301: in run() - task 12673a56-9f93-3fa5-01be-000000000010 27885 1726882535.30313: variable 'ansible_search_path' from source: unknown 27885 1726882535.30339: calling self._execute() 27885 1726882535.30598: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.30601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.30605: variable 'omit' from source: magic vars 27885 1726882535.30764: variable 'ansible_distribution_major_version' from source: facts 27885 1726882535.30780: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882535.30795: _execute() done 27885 1726882535.30805: dumping result to json 27885 1726882535.30813: done dumping result, returning 27885 1726882535.30825: done running TaskExecutor() for managed_node2/TASK: Show interfaces [12673a56-9f93-3fa5-01be-000000000010] 27885 1726882535.30834: sending task result for task 12673a56-9f93-3fa5-01be-000000000010 27885 1726882535.30948: no more pending results, returning what we have 27885 1726882535.30953: in VariableManager get_vars() 27885 1726882535.31113: Calling all_inventory to load vars for managed_node2 27885 1726882535.31117: Calling groups_inventory to load vars for managed_node2 27885 1726882535.31119: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882535.31130: done sending task result for task 12673a56-9f93-3fa5-01be-000000000010 27885 1726882535.31133: WORKER PROCESS EXITING 27885 1726882535.31141: Calling all_plugins_play to load vars for managed_node2 27885 1726882535.31144: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882535.31147: Calling groups_plugins_play to load vars for managed_node2 27885 
1726882535.31398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882535.31546: done with get_vars() 27885 1726882535.31551: variable 'ansible_search_path' from source: unknown 27885 1726882535.31561: we have included files to process 27885 1726882535.31562: generating all_blocks data 27885 1726882535.31564: done generating all_blocks data 27885 1726882535.31569: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27885 1726882535.31570: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27885 1726882535.31573: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27885 1726882535.31641: in VariableManager get_vars() 27885 1726882535.31654: done with get_vars() 27885 1726882535.31727: done processing included file 27885 1726882535.31729: iterating over new_blocks loaded from include file 27885 1726882535.31730: in VariableManager get_vars() 27885 1726882535.31740: done with get_vars() 27885 1726882535.31741: filtering new block on tags 27885 1726882535.31751: done filtering new block on tags 27885 1726882535.31752: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 27885 1726882535.31756: extending task lists for all hosts with included blocks 27885 1726882535.32171: done extending task lists 27885 1726882535.32173: done processing included files 27885 1726882535.32173: results queue empty 27885 1726882535.32173: checking for any_errors_fatal 27885 1726882535.32175: done checking for any_errors_fatal 27885 1726882535.32176: checking for max_fail_percentage 27885 1726882535.32176: done checking for max_fail_percentage 27885 1726882535.32177: checking to see if all hosts have failed and the running result is not ok 27885 1726882535.32177: done checking to see if all hosts have failed 27885 1726882535.32178: getting the remaining hosts for this loop 27885 1726882535.32178: done getting the remaining hosts for this loop 27885 1726882535.32180: getting the next task for host managed_node2 27885 1726882535.32182: done getting next task for host managed_node2 27885 1726882535.32183: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 27885 1726882535.32185: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882535.32187: getting variables 27885 1726882535.32187: in VariableManager get_vars() 27885 1726882535.32200: Calling all_inventory to load vars for managed_node2 27885 1726882535.32201: Calling groups_inventory to load vars for managed_node2 27885 1726882535.32202: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882535.32206: Calling all_plugins_play to load vars for managed_node2 27885 1726882535.32207: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882535.32209: Calling groups_plugins_play to load vars for managed_node2 27885 1726882535.32297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882535.32413: done with get_vars() 27885 1726882535.32421: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:35:35 -0400 (0:00:00.025) 0:00:07.967 ****** 27885 1726882535.32468: entering _queue_task() for managed_node2/include_tasks 27885 1726882535.32642: worker is 1 (out of 1 available) 27885 1726882535.32654: exiting _queue_task() for managed_node2/include_tasks 27885 1726882535.32665: done queuing things up, now waiting for results queue to drain 27885 1726882535.32667: waiting for pending results... 27885 1726882535.32818: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 27885 1726882535.32874: in run() - task 12673a56-9f93-3fa5-01be-000000000282 27885 1726882535.32885: variable 'ansible_search_path' from source: unknown 27885 1726882535.32892: variable 'ansible_search_path' from source: unknown 27885 1726882535.32921: calling self._execute() 27885 1726882535.32981: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.32984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.32997: variable 'omit' from source: magic vars 27885 1726882535.33313: variable 'ansible_distribution_major_version' from source: facts 27885 1726882535.33317: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882535.33333: _execute() done 27885 1726882535.33358: dumping result to json 27885 1726882535.33361: done dumping result, returning 27885 1726882535.33414: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-3fa5-01be-000000000282] 27885 1726882535.33417: sending task result for task 12673a56-9f93-3fa5-01be-000000000282 27885 1726882535.33521: no more pending results, returning what we have 27885 1726882535.33525: in VariableManager get_vars() 27885 1726882535.33567: Calling all_inventory to load vars for managed_node2 27885 1726882535.33569: Calling groups_inventory to load vars for managed_node2 27885 1726882535.33572: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882535.33585: Calling all_plugins_play to load vars for managed_node2 27885 1726882535.33588: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882535.33595: Calling groups_plugins_play to load vars for managed_node2 27885 1726882535.34031: done sending task result for task 12673a56-9f93-3fa5-01be-000000000282 27885 1726882535.34034: WORKER PROCESS EXITING 27885 1726882535.34049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 27885 1726882535.34254: done with get_vars() 27885 1726882535.34262: variable 'ansible_search_path' from source: unknown 27885 1726882535.34263: variable 'ansible_search_path' from source: unknown 27885 1726882535.34297: we have included files to process 27885 1726882535.34299: generating all_blocks data 27885 1726882535.34300: done generating all_blocks data 27885 1726882535.34301: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27885 1726882535.34302: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27885 1726882535.34304: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27885 1726882535.34538: done processing included file 27885 1726882535.34540: iterating over new_blocks loaded from include file 27885 1726882535.34541: in VariableManager get_vars() 27885 1726882535.34558: done with get_vars() 27885 1726882535.34560: filtering new block on tags 27885 1726882535.34576: done filtering new block on tags 27885 1726882535.34578: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 27885 1726882535.34582: extending task lists for all hosts with included blocks 27885 1726882535.34666: done extending task lists 27885 1726882535.34668: done processing included files 27885 1726882535.34668: results queue empty 27885 1726882535.34669: checking for any_errors_fatal 27885 1726882535.34672: done checking for any_errors_fatal 27885 1726882535.34673: checking for max_fail_percentage 27885 1726882535.34673: done checking for max_fail_percentage 27885 1726882535.34674: checking to see if all hosts have failed and the running result is not ok 27885 1726882535.34675: done checking to see if all hosts have failed 27885 1726882535.34675: getting the remaining hosts for this loop 27885 1726882535.34676: done getting the remaining hosts for this loop 27885 1726882535.34678: getting the next task for host managed_node2 27885 1726882535.34682: done getting next task for host managed_node2 27885 1726882535.34684: ^ task is: TASK: Gather current interface info 27885 1726882535.34686: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882535.34689: getting variables 27885 1726882535.34689: in VariableManager get_vars() 27885 1726882535.34702: Calling all_inventory to load vars for managed_node2 27885 1726882535.34704: Calling groups_inventory to load vars for managed_node2 27885 1726882535.34706: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882535.34710: Calling all_plugins_play to load vars for managed_node2 27885 1726882535.34711: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882535.34714: Calling groups_plugins_play to load vars for managed_node2 27885 1726882535.34862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882535.35089: done with get_vars() 27885 1726882535.35099: done getting variables 27885 1726882535.35136: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:35:35 -0400 (0:00:00.026) 0:00:07.994 ****** 27885 1726882535.35162: entering _queue_task() for managed_node2/command 27885 1726882535.35382: worker is 1 (out of 1 available) 27885 1726882535.35597: exiting _queue_task() for managed_node2/command 27885 1726882535.35608: done queuing things up, now waiting for results queue to drain 27885 1726882535.35610: waiting for pending results... 
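(Editor's note, for orientation.) The task being queued here, "Gather current interface info" at get_current_interfaces.yml:3, uses the command action, but the excerpt ends while the task is still setting up its connection, so the actual command line never appears in this log. A minimal hedged equivalent that would collect the same kind of data (the kernel's interface list under /sys/class/net, the same tree the earlier stat task consulted) could look like the sketch below; both the command and the register name are assumptions for illustration, not a quote of the file.

# Illustrative only; the real contents of get_current_interfaces.yml are not
# visible in this log excerpt.
- name: Gather current interface info
  command: ls -1 /sys/class/net
  register: current_interfaces   # assumed variable name

Whatever the real command is, it goes through the same connection plumbing already shown for the stat task and repeated in the lines below: connection variables resolved from host vars, the ssh connection plugin with sh as the shell, and the command action plugin loaded from plugins/action/command.py.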
27885 1726882535.35737: running TaskExecutor() for managed_node2/TASK: Gather current interface info 27885 1726882535.35770: in run() - task 12673a56-9f93-3fa5-01be-0000000002e0 27885 1726882535.35791: variable 'ansible_search_path' from source: unknown 27885 1726882535.35802: variable 'ansible_search_path' from source: unknown 27885 1726882535.35845: calling self._execute() 27885 1726882535.35926: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.35940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.35957: variable 'omit' from source: magic vars 27885 1726882535.36313: variable 'ansible_distribution_major_version' from source: facts 27885 1726882535.36331: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882535.36342: variable 'omit' from source: magic vars 27885 1726882535.36392: variable 'omit' from source: magic vars 27885 1726882535.36495: variable 'omit' from source: magic vars 27885 1726882535.36498: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882535.36514: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882535.36539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882535.36561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882535.36577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882535.36617: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882535.36628: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.36636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.36743: Set connection var ansible_pipelining to False 27885 1726882535.36754: Set connection var ansible_connection to ssh 27885 1726882535.36764: Set connection var ansible_timeout to 10 27885 1726882535.36771: Set connection var ansible_shell_type to sh 27885 1726882535.36817: Set connection var ansible_shell_executable to /bin/sh 27885 1726882535.36820: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882535.36823: variable 'ansible_shell_executable' from source: unknown 27885 1726882535.36827: variable 'ansible_connection' from source: unknown 27885 1726882535.36835: variable 'ansible_module_compression' from source: unknown 27885 1726882535.36842: variable 'ansible_shell_type' from source: unknown 27885 1726882535.36849: variable 'ansible_shell_executable' from source: unknown 27885 1726882535.36854: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.36861: variable 'ansible_pipelining' from source: unknown 27885 1726882535.36925: variable 'ansible_timeout' from source: unknown 27885 1726882535.36928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.37014: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882535.37037: variable 'omit' from source: magic vars 27885 
1726882535.37049: starting attempt loop 27885 1726882535.37055: running the handler 27885 1726882535.37077: _low_level_execute_command(): starting 27885 1726882535.37088: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882535.37811: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27885 1726882535.37905: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882535.37927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882535.37946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882535.38038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882535.39611: stdout chunk (state=3): >>>/root <<< 27885 1726882535.39766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882535.39770: stdout chunk (state=3): >>><<< 27885 1726882535.39772: stderr chunk (state=3): >>><<< 27885 1726882535.39797: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882535.39888: _low_level_execute_command(): starting 27885 1726882535.39892: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882535.398041-28280-139948539756409 `" && echo ansible-tmp-1726882535.398041-28280-139948539756409="` echo 
/root/.ansible/tmp/ansible-tmp-1726882535.398041-28280-139948539756409 `" ) && sleep 0' 27885 1726882535.40510: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882535.40565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882535.40588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882535.40617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882535.40712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882535.42553: stdout chunk (state=3): >>>ansible-tmp-1726882535.398041-28280-139948539756409=/root/.ansible/tmp/ansible-tmp-1726882535.398041-28280-139948539756409 <<< 27885 1726882535.42717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882535.42721: stdout chunk (state=3): >>><<< 27885 1726882535.42723: stderr chunk (state=3): >>><<< 27885 1726882535.42745: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882535.398041-28280-139948539756409=/root/.ansible/tmp/ansible-tmp-1726882535.398041-28280-139948539756409 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882535.42898: variable 'ansible_module_compression' from source: unknown 27885 1726882535.42901: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882535.42904: variable 'ansible_facts' from source: unknown 
27885 1726882535.42975: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882535.398041-28280-139948539756409/AnsiballZ_command.py 27885 1726882535.43147: Sending initial data 27885 1726882535.43156: Sent initial data (155 bytes) 27885 1726882535.43818: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882535.43908: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882535.43943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882535.43959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882535.43979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882535.44073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882535.45596: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882535.45650: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27885 1726882535.45722: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpw5hv0q16 /root/.ansible/tmp/ansible-tmp-1726882535.398041-28280-139948539756409/AnsiballZ_command.py <<< 27885 1726882535.45725: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882535.398041-28280-139948539756409/AnsiballZ_command.py" <<< 27885 1726882535.45776: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpw5hv0q16" to remote "/root/.ansible/tmp/ansible-tmp-1726882535.398041-28280-139948539756409/AnsiballZ_command.py" <<< 27885 1726882535.45779: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882535.398041-28280-139948539756409/AnsiballZ_command.py" <<< 27885 1726882535.46454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882535.46458: stderr chunk (state=3): >>><<< 27885 1726882535.46481: stdout chunk (state=3): >>><<< 27885 1726882535.46506: done transferring module to remote 27885 1726882535.46518: _low_level_execute_command(): starting 27885 1726882535.46607: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882535.398041-28280-139948539756409/ /root/.ansible/tmp/ansible-tmp-1726882535.398041-28280-139948539756409/AnsiballZ_command.py && sleep 0' 27885 1726882535.47187: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882535.47244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882535.47248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882535.47309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882535.49003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882535.49035: stderr chunk (state=3): >>><<< 27885 1726882535.49042: stdout chunk (state=3): >>><<< 27885 1726882535.49056: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882535.49059: _low_level_execute_command(): starting 27885 1726882535.49062: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882535.398041-28280-139948539756409/AnsiballZ_command.py && sleep 0' 27885 1726882535.49641: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882535.49662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882535.49760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882535.64710: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:35:35.643375", "end": "2024-09-20 21:35:35.646301", "delta": "0:00:00.002926", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27885 1726882535.66426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882535.66430: stdout chunk (state=3): >>><<< 27885 1726882535.66433: stderr chunk (state=3): >>><<< 27885 1726882535.66435: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:35:35.643375", "end": "2024-09-20 21:35:35.646301", "delta": "0:00:00.002926", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
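The invocation echoed in the chunk above shows the ansible.legacy.command module running `ls -1` with chdir=/sys/class/net, and the task banner earlier points at get_current_interfaces.yml:3. A minimal task consistent with this trace is sketched below; the register name is inferred from the later "variable '_current_interfaces' from source: set_fact" entry, and changed_when: false is inferred from the final result reporting changed: false even though the raw module output says changed: true. Both are assumptions, not text quoted from the file.

# Hypothetical reconstruction of get_current_interfaces.yml:3, based only on the logged module args
- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces   # assumed name, taken from the later variable trace
  changed_when: false             # assumed; the final task result reports changed: false
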
27885 1726882535.66438: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882535.398041-28280-139948539756409/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882535.66440: _low_level_execute_command(): starting 27885 1726882535.66442: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882535.398041-28280-139948539756409/ > /dev/null 2>&1 && sleep 0' 27885 1726882535.67305: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882535.67310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882535.67313: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882535.67316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882535.67341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882535.67347: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882535.67435: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882535.69248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882535.69263: stdout chunk (state=3): >>><<< 27885 1726882535.69281: stderr chunk (state=3): >>><<< 27885 1726882535.69304: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882535.69323: handler run complete 27885 1726882535.69352: Evaluated conditional (False): False 27885 1726882535.69375: attempt loop complete, returning result 27885 1726882535.69382: _execute() done 27885 1726882535.69398: dumping result to json 27885 1726882535.69416: done dumping result, returning 27885 1726882535.69441: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [12673a56-9f93-3fa5-01be-0000000002e0] 27885 1726882535.69459: sending task result for task 12673a56-9f93-3fa5-01be-0000000002e0 ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.002926", "end": "2024-09-20 21:35:35.646301", "rc": 0, "start": "2024-09-20 21:35:35.643375" } STDOUT: bonding_masters eth0 ethtest0 lo peerethtest0 rpltstbr 27885 1726882535.69819: no more pending results, returning what we have 27885 1726882535.69823: results queue empty 27885 1726882535.69825: checking for any_errors_fatal 27885 1726882535.69826: done checking for any_errors_fatal 27885 1726882535.69827: checking for max_fail_percentage 27885 1726882535.69828: done checking for max_fail_percentage 27885 1726882535.69829: checking to see if all hosts have failed and the running result is not ok 27885 1726882535.69830: done checking to see if all hosts have failed 27885 1726882535.69831: getting the remaining hosts for this loop 27885 1726882535.69832: done getting the remaining hosts for this loop 27885 1726882535.69836: getting the next task for host managed_node2 27885 1726882535.69844: done getting next task for host managed_node2 27885 1726882535.69846: ^ task is: TASK: Set current_interfaces 27885 1726882535.69851: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882535.69856: getting variables 27885 1726882535.69858: in VariableManager get_vars() 27885 1726882535.70024: Calling all_inventory to load vars for managed_node2 27885 1726882535.70027: Calling groups_inventory to load vars for managed_node2 27885 1726882535.70030: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882535.70064: Calling all_plugins_play to load vars for managed_node2 27885 1726882535.70068: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882535.70072: Calling groups_plugins_play to load vars for managed_node2 27885 1726882535.70451: done sending task result for task 12673a56-9f93-3fa5-01be-0000000002e0 27885 1726882535.70454: WORKER PROCESS EXITING 27885 1726882535.70476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882535.70784: done with get_vars() 27885 1726882535.70797: done getting variables 27885 1726882535.70889: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:35:35 -0400 (0:00:00.357) 0:00:08.351 ****** 27885 1726882535.70935: entering _queue_task() for managed_node2/set_fact 27885 1726882535.71164: worker is 1 (out of 1 available) 27885 1726882535.71178: exiting _queue_task() for managed_node2/set_fact 27885 1726882535.71200: done queuing things up, now waiting for results queue to drain 27885 1726882535.71202: waiting for pending results... 
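The banner just above points at get_current_interfaces.yml:9, and the set_fact result a few entries further down stores the six interface names returned by the previous command under current_interfaces. A minimal sketch consistent with that trace, assuming the fact is built from the registered command output, would be:

# Hypothetical reconstruction of get_current_interfaces.yml:9; the stdout_lines
# expression is an assumption that matches the logged result, not quoted source
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
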
27885 1726882535.71352: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 27885 1726882535.71420: in run() - task 12673a56-9f93-3fa5-01be-0000000002e1 27885 1726882535.71431: variable 'ansible_search_path' from source: unknown 27885 1726882535.71439: variable 'ansible_search_path' from source: unknown 27885 1726882535.71469: calling self._execute() 27885 1726882535.71534: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.71541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.71551: variable 'omit' from source: magic vars 27885 1726882535.71861: variable 'ansible_distribution_major_version' from source: facts 27885 1726882535.71874: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882535.71877: variable 'omit' from source: magic vars 27885 1726882535.71914: variable 'omit' from source: magic vars 27885 1726882535.71996: variable '_current_interfaces' from source: set_fact 27885 1726882535.72045: variable 'omit' from source: magic vars 27885 1726882535.72074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882535.72108: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882535.72122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882535.72135: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882535.72144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882535.72166: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882535.72170: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.72172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.72244: Set connection var ansible_pipelining to False 27885 1726882535.72247: Set connection var ansible_connection to ssh 27885 1726882535.72253: Set connection var ansible_timeout to 10 27885 1726882535.72255: Set connection var ansible_shell_type to sh 27885 1726882535.72260: Set connection var ansible_shell_executable to /bin/sh 27885 1726882535.72265: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882535.72282: variable 'ansible_shell_executable' from source: unknown 27885 1726882535.72285: variable 'ansible_connection' from source: unknown 27885 1726882535.72288: variable 'ansible_module_compression' from source: unknown 27885 1726882535.72294: variable 'ansible_shell_type' from source: unknown 27885 1726882535.72297: variable 'ansible_shell_executable' from source: unknown 27885 1726882535.72299: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.72303: variable 'ansible_pipelining' from source: unknown 27885 1726882535.72305: variable 'ansible_timeout' from source: unknown 27885 1726882535.72307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.72399: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 27885 1726882535.72405: variable 'omit' from source: magic vars 27885 1726882535.72411: starting attempt loop 27885 1726882535.72414: running the handler 27885 1726882535.72428: handler run complete 27885 1726882535.72434: attempt loop complete, returning result 27885 1726882535.72437: _execute() done 27885 1726882535.72439: dumping result to json 27885 1726882535.72444: done dumping result, returning 27885 1726882535.72450: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [12673a56-9f93-3fa5-01be-0000000002e1] 27885 1726882535.72454: sending task result for task 12673a56-9f93-3fa5-01be-0000000002e1 27885 1726882535.72530: done sending task result for task 12673a56-9f93-3fa5-01be-0000000002e1 27885 1726882535.72533: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "ethtest0", "lo", "peerethtest0", "rpltstbr" ] }, "changed": false } 27885 1726882535.72588: no more pending results, returning what we have 27885 1726882535.72595: results queue empty 27885 1726882535.72597: checking for any_errors_fatal 27885 1726882535.72603: done checking for any_errors_fatal 27885 1726882535.72603: checking for max_fail_percentage 27885 1726882535.72604: done checking for max_fail_percentage 27885 1726882535.72605: checking to see if all hosts have failed and the running result is not ok 27885 1726882535.72606: done checking to see if all hosts have failed 27885 1726882535.72607: getting the remaining hosts for this loop 27885 1726882535.72608: done getting the remaining hosts for this loop 27885 1726882535.72611: getting the next task for host managed_node2 27885 1726882535.72617: done getting next task for host managed_node2 27885 1726882535.72620: ^ task is: TASK: Show current_interfaces 27885 1726882535.72623: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882535.72626: getting variables 27885 1726882535.72627: in VariableManager get_vars() 27885 1726882535.72659: Calling all_inventory to load vars for managed_node2 27885 1726882535.72665: Calling groups_inventory to load vars for managed_node2 27885 1726882535.72667: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882535.72678: Calling all_plugins_play to load vars for managed_node2 27885 1726882535.72681: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882535.72686: Calling groups_plugins_play to load vars for managed_node2 27885 1726882535.72968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882535.73173: done with get_vars() 27885 1726882535.73182: done getting variables 27885 1726882535.73236: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:35:35 -0400 (0:00:00.023) 0:00:08.375 ****** 27885 1726882535.73262: entering _queue_task() for managed_node2/debug 27885 1726882535.73589: worker is 1 (out of 1 available) 27885 1726882535.73602: exiting _queue_task() for managed_node2/debug 27885 1726882535.73617: done queuing things up, now waiting for results queue to drain 27885 1726882535.73619: waiting for pending results... 27885 1726882535.73773: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 27885 1726882535.73842: in run() - task 12673a56-9f93-3fa5-01be-000000000283 27885 1726882535.73853: variable 'ansible_search_path' from source: unknown 27885 1726882535.73856: variable 'ansible_search_path' from source: unknown 27885 1726882535.73881: calling self._execute() 27885 1726882535.73941: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.73944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.73955: variable 'omit' from source: magic vars 27885 1726882535.74196: variable 'ansible_distribution_major_version' from source: facts 27885 1726882535.74203: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882535.74209: variable 'omit' from source: magic vars 27885 1726882535.74238: variable 'omit' from source: magic vars 27885 1726882535.74303: variable 'current_interfaces' from source: set_fact 27885 1726882535.74325: variable 'omit' from source: magic vars 27885 1726882535.74356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882535.74381: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882535.74399: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882535.74411: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882535.74421: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882535.74445: 
variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882535.74448: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.74450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.74516: Set connection var ansible_pipelining to False 27885 1726882535.74519: Set connection var ansible_connection to ssh 27885 1726882535.74525: Set connection var ansible_timeout to 10 27885 1726882535.74527: Set connection var ansible_shell_type to sh 27885 1726882535.74533: Set connection var ansible_shell_executable to /bin/sh 27885 1726882535.74537: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882535.74557: variable 'ansible_shell_executable' from source: unknown 27885 1726882535.74560: variable 'ansible_connection' from source: unknown 27885 1726882535.74563: variable 'ansible_module_compression' from source: unknown 27885 1726882535.74565: variable 'ansible_shell_type' from source: unknown 27885 1726882535.74567: variable 'ansible_shell_executable' from source: unknown 27885 1726882535.74569: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.74571: variable 'ansible_pipelining' from source: unknown 27885 1726882535.74574: variable 'ansible_timeout' from source: unknown 27885 1726882535.74578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.74672: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882535.74680: variable 'omit' from source: magic vars 27885 1726882535.74685: starting attempt loop 27885 1726882535.74688: running the handler 27885 1726882535.74724: handler run complete 27885 1726882535.74734: attempt loop complete, returning result 27885 1726882535.74737: _execute() done 27885 1726882535.74739: dumping result to json 27885 1726882535.74742: done dumping result, returning 27885 1726882535.74748: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [12673a56-9f93-3fa5-01be-000000000283] 27885 1726882535.74752: sending task result for task 12673a56-9f93-3fa5-01be-000000000283 27885 1726882535.74831: done sending task result for task 12673a56-9f93-3fa5-01be-000000000283 27885 1726882535.74834: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'ethtest0', 'lo', 'peerethtest0', 'rpltstbr'] 27885 1726882535.74878: no more pending results, returning what we have 27885 1726882535.74880: results queue empty 27885 1726882535.74881: checking for any_errors_fatal 27885 1726882535.74885: done checking for any_errors_fatal 27885 1726882535.74885: checking for max_fail_percentage 27885 1726882535.74887: done checking for max_fail_percentage 27885 1726882535.74888: checking to see if all hosts have failed and the running result is not ok 27885 1726882535.74888: done checking to see if all hosts have failed 27885 1726882535.74891: getting the remaining hosts for this loop 27885 1726882535.74894: done getting the remaining hosts for this loop 27885 1726882535.74897: getting the next task for host managed_node2 27885 1726882535.74903: done getting next task for host managed_node2 27885 1726882535.74906: ^ task is: TASK: Manage test interface 27885 1726882535.74908: ^ state is: 
HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882535.74911: getting variables 27885 1726882535.74912: in VariableManager get_vars() 27885 1726882535.74941: Calling all_inventory to load vars for managed_node2 27885 1726882535.74944: Calling groups_inventory to load vars for managed_node2 27885 1726882535.74946: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882535.74953: Calling all_plugins_play to load vars for managed_node2 27885 1726882535.74955: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882535.74958: Calling groups_plugins_play to load vars for managed_node2 27885 1726882535.75073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882535.75202: done with get_vars() 27885 1726882535.75209: done getting variables TASK [Manage test interface] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:28 Friday 20 September 2024 21:35:35 -0400 (0:00:00.020) 0:00:08.395 ****** 27885 1726882535.75264: entering _queue_task() for managed_node2/include_tasks 27885 1726882535.75435: worker is 1 (out of 1 available) 27885 1726882535.75447: exiting _queue_task() for managed_node2/include_tasks 27885 1726882535.75460: done queuing things up, now waiting for results queue to drain 27885 1726882535.75461: waiting for pending results... 27885 1726882535.75675: running TaskExecutor() for managed_node2/TASK: Manage test interface 27885 1726882535.75899: in run() - task 12673a56-9f93-3fa5-01be-000000000011 27885 1726882535.75902: variable 'ansible_search_path' from source: unknown 27885 1726882535.75905: calling self._execute() 27885 1726882535.75907: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.75910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.75914: variable 'omit' from source: magic vars 27885 1726882535.76306: variable 'ansible_distribution_major_version' from source: facts 27885 1726882535.76323: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882535.76334: _execute() done 27885 1726882535.76345: dumping result to json 27885 1726882535.76357: done dumping result, returning 27885 1726882535.76369: done running TaskExecutor() for managed_node2/TASK: Manage test interface [12673a56-9f93-3fa5-01be-000000000011] 27885 1726882535.76387: sending task result for task 12673a56-9f93-3fa5-01be-000000000011 27885 1726882535.76601: no more pending results, returning what we have 27885 1726882535.76606: in VariableManager get_vars() 27885 1726882535.76648: Calling all_inventory to load vars for managed_node2 27885 1726882535.76651: Calling groups_inventory to load vars for managed_node2 27885 1726882535.76653: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882535.76665: Calling all_plugins_play to load vars for managed_node2 27885 1726882535.76668: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882535.76671: Calling groups_plugins_play to load vars for managed_node2 27885 1726882535.77032: done sending task result for task 12673a56-9f93-3fa5-01be-000000000011 
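Two further tasks are traced above: the debug at show_interfaces.yml:5, whose MSG line prints the current_interfaces fact, and the include at tests_route_device.yml:28, which the entries just below resolve to manage_test_interface.yml. Reconstructed only from what the trace shows, they would look roughly like:

# show_interfaces.yml:5 — msg wording taken from the logged MSG line
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"

# tests_route_device.yml:28 — the relative path is an assumption based on the
# included file reported below; the trace later shows 'state' arriving via
# include params, but the exact vars passed are not visible here
- name: Manage test interface
  include_tasks: tasks/manage_test_interface.yml
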
27885 1726882535.77036: WORKER PROCESS EXITING 27885 1726882535.77045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882535.77175: done with get_vars() 27885 1726882535.77180: variable 'ansible_search_path' from source: unknown 27885 1726882535.77189: we have included files to process 27885 1726882535.77192: generating all_blocks data 27885 1726882535.77194: done generating all_blocks data 27885 1726882535.77197: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 27885 1726882535.77198: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 27885 1726882535.77200: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 27885 1726882535.77437: in VariableManager get_vars() 27885 1726882535.77450: done with get_vars() 27885 1726882535.77867: done processing included file 27885 1726882535.77868: iterating over new_blocks loaded from include file 27885 1726882535.77869: in VariableManager get_vars() 27885 1726882535.77880: done with get_vars() 27885 1726882535.77881: filtering new block on tags 27885 1726882535.77904: done filtering new block on tags 27885 1726882535.77906: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node2 27885 1726882535.77909: extending task lists for all hosts with included blocks 27885 1726882535.78538: done extending task lists 27885 1726882535.78539: done processing included files 27885 1726882535.78540: results queue empty 27885 1726882535.78541: checking for any_errors_fatal 27885 1726882535.78543: done checking for any_errors_fatal 27885 1726882535.78543: checking for max_fail_percentage 27885 1726882535.78544: done checking for max_fail_percentage 27885 1726882535.78545: checking to see if all hosts have failed and the running result is not ok 27885 1726882535.78546: done checking to see if all hosts have failed 27885 1726882535.78546: getting the remaining hosts for this loop 27885 1726882535.78547: done getting the remaining hosts for this loop 27885 1726882535.78550: getting the next task for host managed_node2 27885 1726882535.78553: done getting next task for host managed_node2 27885 1726882535.78555: ^ task is: TASK: Ensure state in ["present", "absent"] 27885 1726882535.78558: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882535.78560: getting variables 27885 1726882535.78560: in VariableManager get_vars() 27885 1726882535.78574: Calling all_inventory to load vars for managed_node2 27885 1726882535.78575: Calling groups_inventory to load vars for managed_node2 27885 1726882535.78576: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882535.78580: Calling all_plugins_play to load vars for managed_node2 27885 1726882535.78582: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882535.78583: Calling groups_plugins_play to load vars for managed_node2 27885 1726882535.78724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882535.78941: done with get_vars() 27885 1726882535.78950: done getting variables 27885 1726882535.78984: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:35:35 -0400 (0:00:00.037) 0:00:08.432 ****** 27885 1726882535.79012: entering _queue_task() for managed_node2/fail 27885 1726882535.79250: worker is 1 (out of 1 available) 27885 1726882535.79260: exiting _queue_task() for managed_node2/fail 27885 1726882535.79275: done queuing things up, now waiting for results queue to drain 27885 1726882535.79276: waiting for pending results... 27885 1726882535.79533: running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] 27885 1726882535.79613: in run() - task 12673a56-9f93-3fa5-01be-0000000002fc 27885 1726882535.79639: variable 'ansible_search_path' from source: unknown 27885 1726882535.79643: variable 'ansible_search_path' from source: unknown 27885 1726882535.79661: calling self._execute() 27885 1726882535.79898: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.79902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.79904: variable 'omit' from source: magic vars 27885 1726882535.80115: variable 'ansible_distribution_major_version' from source: facts 27885 1726882535.80132: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882535.80270: variable 'state' from source: include params 27885 1726882535.80281: Evaluated conditional (state not in ["present", "absent"]): False 27885 1726882535.80288: when evaluation is False, skipping this task 27885 1726882535.80296: _execute() done 27885 1726882535.80305: dumping result to json 27885 1726882535.80311: done dumping result, returning 27885 1726882535.80319: done running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] [12673a56-9f93-3fa5-01be-0000000002fc] 27885 1726882535.80327: sending task result for task 12673a56-9f93-3fa5-01be-0000000002fc 27885 1726882535.80427: done sending task result for task 12673a56-9f93-3fa5-01be-0000000002fc 27885 1726882535.80434: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 27885 1726882535.80507: no more pending 
results, returning what we have 27885 1726882535.80512: results queue empty 27885 1726882535.80513: checking for any_errors_fatal 27885 1726882535.80514: done checking for any_errors_fatal 27885 1726882535.80514: checking for max_fail_percentage 27885 1726882535.80516: done checking for max_fail_percentage 27885 1726882535.80516: checking to see if all hosts have failed and the running result is not ok 27885 1726882535.80517: done checking to see if all hosts have failed 27885 1726882535.80518: getting the remaining hosts for this loop 27885 1726882535.80519: done getting the remaining hosts for this loop 27885 1726882535.80523: getting the next task for host managed_node2 27885 1726882535.80529: done getting next task for host managed_node2 27885 1726882535.80532: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 27885 1726882535.80537: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882535.80541: getting variables 27885 1726882535.80543: in VariableManager get_vars() 27885 1726882535.80582: Calling all_inventory to load vars for managed_node2 27885 1726882535.80585: Calling groups_inventory to load vars for managed_node2 27885 1726882535.80588: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882535.80603: Calling all_plugins_play to load vars for managed_node2 27885 1726882535.80606: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882535.80609: Calling groups_plugins_play to load vars for managed_node2 27885 1726882535.80907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882535.81048: done with get_vars() 27885 1726882535.81055: done getting variables 27885 1726882535.81096: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:35:35 -0400 (0:00:00.021) 0:00:08.453 ****** 27885 1726882535.81115: entering _queue_task() for managed_node2/fail 27885 1726882535.81297: worker is 1 (out of 1 available) 27885 1726882535.81310: exiting _queue_task() for managed_node2/fail 27885 1726882535.81323: done queuing things up, now waiting for results queue to drain 27885 1726882535.81324: waiting for pending results... 
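The guard skip above, the type-check banner just above it, and the include banner a little further down together trace the first tasks of manage_test_interface.yml (lines 3, 8 and 13): two guard tasks whose conditions appear verbatim in the logged false_condition fields, and an include of show_interfaces.yml. A sketch consistent with that, with the fail messages invented for illustration, would be:

# Hypothetical head of manage_test_interface.yml; conditions copied from the
# logged false_condition fields, msg text assumed
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "Unsupported state"          # assumed wording
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "Unsupported type"           # assumed wording
  when: type not in ["dummy", "tap", "veth"]

- name: Include the task 'show_interfaces.yml'
  include_tasks: show_interfaces.yml
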
27885 1726882535.81473: running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] 27885 1726882535.81539: in run() - task 12673a56-9f93-3fa5-01be-0000000002fd 27885 1726882535.81551: variable 'ansible_search_path' from source: unknown 27885 1726882535.81554: variable 'ansible_search_path' from source: unknown 27885 1726882535.81580: calling self._execute() 27885 1726882535.81644: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.81649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.81656: variable 'omit' from source: magic vars 27885 1726882535.81929: variable 'ansible_distribution_major_version' from source: facts 27885 1726882535.81938: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882535.82042: variable 'type' from source: set_fact 27885 1726882535.82045: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 27885 1726882535.82048: when evaluation is False, skipping this task 27885 1726882535.82051: _execute() done 27885 1726882535.82053: dumping result to json 27885 1726882535.82057: done dumping result, returning 27885 1726882535.82063: done running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] [12673a56-9f93-3fa5-01be-0000000002fd] 27885 1726882535.82068: sending task result for task 12673a56-9f93-3fa5-01be-0000000002fd 27885 1726882535.82147: done sending task result for task 12673a56-9f93-3fa5-01be-0000000002fd 27885 1726882535.82150: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 27885 1726882535.82203: no more pending results, returning what we have 27885 1726882535.82206: results queue empty 27885 1726882535.82207: checking for any_errors_fatal 27885 1726882535.82211: done checking for any_errors_fatal 27885 1726882535.82212: checking for max_fail_percentage 27885 1726882535.82213: done checking for max_fail_percentage 27885 1726882535.82214: checking to see if all hosts have failed and the running result is not ok 27885 1726882535.82214: done checking to see if all hosts have failed 27885 1726882535.82215: getting the remaining hosts for this loop 27885 1726882535.82216: done getting the remaining hosts for this loop 27885 1726882535.82219: getting the next task for host managed_node2 27885 1726882535.82223: done getting next task for host managed_node2 27885 1726882535.82226: ^ task is: TASK: Include the task 'show_interfaces.yml' 27885 1726882535.82229: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882535.82232: getting variables 27885 1726882535.82233: in VariableManager get_vars() 27885 1726882535.82265: Calling all_inventory to load vars for managed_node2 27885 1726882535.82268: Calling groups_inventory to load vars for managed_node2 27885 1726882535.82270: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882535.82278: Calling all_plugins_play to load vars for managed_node2 27885 1726882535.82280: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882535.82283: Calling groups_plugins_play to load vars for managed_node2 27885 1726882535.82433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882535.82563: done with get_vars() 27885 1726882535.82575: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:35:35 -0400 (0:00:00.015) 0:00:08.469 ****** 27885 1726882535.82676: entering _queue_task() for managed_node2/include_tasks 27885 1726882535.82915: worker is 1 (out of 1 available) 27885 1726882535.82930: exiting _queue_task() for managed_node2/include_tasks 27885 1726882535.82942: done queuing things up, now waiting for results queue to drain 27885 1726882535.82943: waiting for pending results... 27885 1726882535.83200: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 27885 1726882535.83401: in run() - task 12673a56-9f93-3fa5-01be-0000000002fe 27885 1726882535.83405: variable 'ansible_search_path' from source: unknown 27885 1726882535.83407: variable 'ansible_search_path' from source: unknown 27885 1726882535.83415: calling self._execute() 27885 1726882535.83541: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.83547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.83561: variable 'omit' from source: magic vars 27885 1726882535.83835: variable 'ansible_distribution_major_version' from source: facts 27885 1726882535.83841: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882535.83848: _execute() done 27885 1726882535.83851: dumping result to json 27885 1726882535.83853: done dumping result, returning 27885 1726882535.83859: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-3fa5-01be-0000000002fe] 27885 1726882535.83864: sending task result for task 12673a56-9f93-3fa5-01be-0000000002fe 27885 1726882535.83942: done sending task result for task 12673a56-9f93-3fa5-01be-0000000002fe 27885 1726882535.83945: WORKER PROCESS EXITING 27885 1726882535.83969: no more pending results, returning what we have 27885 1726882535.83973: in VariableManager get_vars() 27885 1726882535.84018: Calling all_inventory to load vars for managed_node2 27885 1726882535.84020: Calling groups_inventory to load vars for managed_node2 27885 1726882535.84023: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882535.84031: Calling all_plugins_play to load vars for managed_node2 27885 1726882535.84033: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882535.84035: Calling groups_plugins_play to load vars for managed_node2 27885 1726882535.84165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 27885 1726882535.84288: done with get_vars() 27885 1726882535.84296: variable 'ansible_search_path' from source: unknown 27885 1726882535.84298: variable 'ansible_search_path' from source: unknown 27885 1726882535.84320: we have included files to process 27885 1726882535.84321: generating all_blocks data 27885 1726882535.84322: done generating all_blocks data 27885 1726882535.84326: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27885 1726882535.84326: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27885 1726882535.84327: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27885 1726882535.84388: in VariableManager get_vars() 27885 1726882535.84406: done with get_vars() 27885 1726882535.84475: done processing included file 27885 1726882535.84476: iterating over new_blocks loaded from include file 27885 1726882535.84477: in VariableManager get_vars() 27885 1726882535.84488: done with get_vars() 27885 1726882535.84489: filtering new block on tags 27885 1726882535.84502: done filtering new block on tags 27885 1726882535.84504: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 27885 1726882535.84507: extending task lists for all hosts with included blocks 27885 1726882535.84752: done extending task lists 27885 1726882535.84753: done processing included files 27885 1726882535.84754: results queue empty 27885 1726882535.84754: checking for any_errors_fatal 27885 1726882535.84756: done checking for any_errors_fatal 27885 1726882535.84756: checking for max_fail_percentage 27885 1726882535.84757: done checking for max_fail_percentage 27885 1726882535.84758: checking to see if all hosts have failed and the running result is not ok 27885 1726882535.84758: done checking to see if all hosts have failed 27885 1726882535.84758: getting the remaining hosts for this loop 27885 1726882535.84759: done getting the remaining hosts for this loop 27885 1726882535.84760: getting the next task for host managed_node2 27885 1726882535.84763: done getting next task for host managed_node2 27885 1726882535.84764: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 27885 1726882535.84767: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882535.84768: getting variables 27885 1726882535.84769: in VariableManager get_vars() 27885 1726882535.84777: Calling all_inventory to load vars for managed_node2 27885 1726882535.84778: Calling groups_inventory to load vars for managed_node2 27885 1726882535.84780: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882535.84783: Calling all_plugins_play to load vars for managed_node2 27885 1726882535.84784: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882535.84786: Calling groups_plugins_play to load vars for managed_node2 27885 1726882535.84873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882535.84995: done with get_vars() 27885 1726882535.85002: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:35:35 -0400 (0:00:00.023) 0:00:08.492 ****** 27885 1726882535.85045: entering _queue_task() for managed_node2/include_tasks 27885 1726882535.85206: worker is 1 (out of 1 available) 27885 1726882535.85219: exiting _queue_task() for managed_node2/include_tasks 27885 1726882535.85231: done queuing things up, now waiting for results queue to drain 27885 1726882535.85232: waiting for pending results... 27885 1726882535.85371: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 27885 1726882535.85436: in run() - task 12673a56-9f93-3fa5-01be-000000000374 27885 1726882535.85447: variable 'ansible_search_path' from source: unknown 27885 1726882535.85450: variable 'ansible_search_path' from source: unknown 27885 1726882535.85479: calling self._execute() 27885 1726882535.85536: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.85542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.85549: variable 'omit' from source: magic vars 27885 1726882535.85791: variable 'ansible_distribution_major_version' from source: facts 27885 1726882535.85807: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882535.85813: _execute() done 27885 1726882535.85815: dumping result to json 27885 1726882535.85818: done dumping result, returning 27885 1726882535.85824: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-3fa5-01be-000000000374] 27885 1726882535.85829: sending task result for task 12673a56-9f93-3fa5-01be-000000000374 27885 1726882535.85909: done sending task result for task 12673a56-9f93-3fa5-01be-000000000374 27885 1726882535.85912: WORKER PROCESS EXITING 27885 1726882535.85937: no more pending results, returning what we have 27885 1726882535.85940: in VariableManager get_vars() 27885 1726882535.85974: Calling all_inventory to load vars for managed_node2 27885 1726882535.85976: Calling groups_inventory to load vars for managed_node2 27885 1726882535.85978: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882535.85987: Calling all_plugins_play to load vars for managed_node2 27885 1726882535.85989: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882535.85992: Calling groups_plugins_play to load vars for managed_node2 27885 1726882535.86138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 27885 1726882535.86258: done with get_vars() 27885 1726882535.86264: variable 'ansible_search_path' from source: unknown 27885 1726882535.86264: variable 'ansible_search_path' from source: unknown 27885 1726882535.86301: we have included files to process 27885 1726882535.86302: generating all_blocks data 27885 1726882535.86303: done generating all_blocks data 27885 1726882535.86303: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27885 1726882535.86304: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27885 1726882535.86305: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27885 1726882535.86463: done processing included file 27885 1726882535.86465: iterating over new_blocks loaded from include file 27885 1726882535.86466: in VariableManager get_vars() 27885 1726882535.86478: done with get_vars() 27885 1726882535.86479: filtering new block on tags 27885 1726882535.86490: done filtering new block on tags 27885 1726882535.86492: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 27885 1726882535.86496: extending task lists for all hosts with included blocks 27885 1726882535.86577: done extending task lists 27885 1726882535.86578: done processing included files 27885 1726882535.86579: results queue empty 27885 1726882535.86579: checking for any_errors_fatal 27885 1726882535.86580: done checking for any_errors_fatal 27885 1726882535.86581: checking for max_fail_percentage 27885 1726882535.86581: done checking for max_fail_percentage 27885 1726882535.86582: checking to see if all hosts have failed and the running result is not ok 27885 1726882535.86582: done checking to see if all hosts have failed 27885 1726882535.86583: getting the remaining hosts for this loop 27885 1726882535.86583: done getting the remaining hosts for this loop 27885 1726882535.86585: getting the next task for host managed_node2 27885 1726882535.86587: done getting next task for host managed_node2 27885 1726882535.86589: ^ task is: TASK: Gather current interface info 27885 1726882535.86592: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 27885 1726882535.86595: getting variables 27885 1726882535.86596: in VariableManager get_vars() 27885 1726882535.86603: Calling all_inventory to load vars for managed_node2 27885 1726882535.86605: Calling groups_inventory to load vars for managed_node2 27885 1726882535.86606: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882535.86609: Calling all_plugins_play to load vars for managed_node2 27885 1726882535.86610: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882535.86612: Calling groups_plugins_play to load vars for managed_node2 27885 1726882535.86699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882535.86833: done with get_vars() 27885 1726882535.86839: done getting variables 27885 1726882535.86863: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:35:35 -0400 (0:00:00.018) 0:00:08.511 ****** 27885 1726882535.86882: entering _queue_task() for managed_node2/command 27885 1726882535.87039: worker is 1 (out of 1 available) 27885 1726882535.87051: exiting _queue_task() for managed_node2/command 27885 1726882535.87061: done queuing things up, now waiting for results queue to drain 27885 1726882535.87062: waiting for pending results... 
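
At this point the log has walked a chain of nested includes: manage_test_interface.yml:13 includes show_interfaces.yml, whose first task (line 3) includes get_current_interfaces.yml, whose first task (line 3) is the command being queued here. A hedged sketch of how those files likely fit together, reconstructed from the task names, paths, and line numbers in the log rather than from the source tree:

---
# manage_test_interface.yml:13 (hypothetical reconstruction)
- name: Include the task 'show_interfaces.yml'
  include_tasks: show_interfaces.yml

# show_interfaces.yml (hypothetical reconstruction)
- name: Include the task 'get_current_interfaces.yml'   # show_interfaces.yml:3
  include_tasks: get_current_interfaces.yml

- name: Show current_interfaces                          # show_interfaces.yml:5
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"

The debug message format matches the "current_interfaces: [...]" output printed later in the log, which is what suggests this shape, but the exact file contents are not quoted here.
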
27885 1726882535.87308: running TaskExecutor() for managed_node2/TASK: Gather current interface info 27885 1726882535.87313: in run() - task 12673a56-9f93-3fa5-01be-0000000003ab 27885 1726882535.87317: variable 'ansible_search_path' from source: unknown 27885 1726882535.87319: variable 'ansible_search_path' from source: unknown 27885 1726882535.87322: calling self._execute() 27885 1726882535.87352: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.87357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.87365: variable 'omit' from source: magic vars 27885 1726882535.87594: variable 'ansible_distribution_major_version' from source: facts 27885 1726882535.87606: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882535.87612: variable 'omit' from source: magic vars 27885 1726882535.87647: variable 'omit' from source: magic vars 27885 1726882535.87669: variable 'omit' from source: magic vars 27885 1726882535.87700: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882535.87731: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882535.87741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882535.87754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882535.87764: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882535.87785: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882535.87788: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.87790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.87867: Set connection var ansible_pipelining to False 27885 1726882535.87870: Set connection var ansible_connection to ssh 27885 1726882535.87875: Set connection var ansible_timeout to 10 27885 1726882535.87878: Set connection var ansible_shell_type to sh 27885 1726882535.87883: Set connection var ansible_shell_executable to /bin/sh 27885 1726882535.87888: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882535.87909: variable 'ansible_shell_executable' from source: unknown 27885 1726882535.87912: variable 'ansible_connection' from source: unknown 27885 1726882535.87915: variable 'ansible_module_compression' from source: unknown 27885 1726882535.87918: variable 'ansible_shell_type' from source: unknown 27885 1726882535.87920: variable 'ansible_shell_executable' from source: unknown 27885 1726882535.87922: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882535.87924: variable 'ansible_pipelining' from source: unknown 27885 1726882535.87926: variable 'ansible_timeout' from source: unknown 27885 1726882535.87931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882535.88025: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882535.88033: variable 'omit' from source: magic vars 27885 
1726882535.88038: starting attempt loop 27885 1726882535.88041: running the handler 27885 1726882535.88058: _low_level_execute_command(): starting 27885 1726882535.88061: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882535.88934: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882535.88971: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882535.89087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882535.89099: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882535.89146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882535.89217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882535.90822: stdout chunk (state=3): >>>/root <<< 27885 1726882535.90981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882535.90984: stdout chunk (state=3): >>><<< 27885 1726882535.90987: stderr chunk (state=3): >>><<< 27885 1726882535.91101: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882535.91105: _low_level_execute_command(): starting 27885 1726882535.91108: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882535.9101217-28311-61794970032863 `" && echo 
ansible-tmp-1726882535.9101217-28311-61794970032863="` echo /root/.ansible/tmp/ansible-tmp-1726882535.9101217-28311-61794970032863 `" ) && sleep 0' 27885 1726882535.91498: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882535.91513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882535.91526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882535.91573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882535.91585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882535.91651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882535.93513: stdout chunk (state=3): >>>ansible-tmp-1726882535.9101217-28311-61794970032863=/root/.ansible/tmp/ansible-tmp-1726882535.9101217-28311-61794970032863 <<< 27885 1726882535.93621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882535.93641: stderr chunk (state=3): >>><<< 27885 1726882535.93644: stdout chunk (state=3): >>><<< 27885 1726882535.93657: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882535.9101217-28311-61794970032863=/root/.ansible/tmp/ansible-tmp-1726882535.9101217-28311-61794970032863 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882535.93680: variable 'ansible_module_compression' from source: unknown 27885 1726882535.93724: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882535.93751: variable 'ansible_facts' from source: unknown 27885 1726882535.93810: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882535.9101217-28311-61794970032863/AnsiballZ_command.py 27885 1726882535.93902: Sending initial data 27885 1726882535.93905: Sent initial data (155 bytes) 27885 1726882535.94319: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882535.94322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882535.94324: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882535.94326: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882535.94328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882535.94377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882535.94380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882535.94443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882535.95960: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 27885 1726882535.95964: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882535.96019: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27885 1726882535.96081: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpgqej5im4 /root/.ansible/tmp/ansible-tmp-1726882535.9101217-28311-61794970032863/AnsiballZ_command.py <<< 27885 1726882535.96086: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882535.9101217-28311-61794970032863/AnsiballZ_command.py" <<< 27885 1726882535.96139: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpgqej5im4" to remote "/root/.ansible/tmp/ansible-tmp-1726882535.9101217-28311-61794970032863/AnsiballZ_command.py" <<< 27885 1726882535.96142: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882535.9101217-28311-61794970032863/AnsiballZ_command.py" <<< 27885 1726882535.96745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882535.96783: stderr chunk (state=3): >>><<< 27885 1726882535.96786: stdout chunk (state=3): >>><<< 27885 1726882535.96829: done transferring module to remote 27885 1726882535.96840: _low_level_execute_command(): starting 27885 1726882535.96843: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882535.9101217-28311-61794970032863/ /root/.ansible/tmp/ansible-tmp-1726882535.9101217-28311-61794970032863/AnsiballZ_command.py && sleep 0' 27885 1726882535.97298: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882535.97302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882535.97304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882535.97306: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882535.97309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882535.97347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882535.97350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882535.97418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882535.99128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882535.99148: stderr chunk (state=3): >>><<< 27885 1726882535.99151: stdout chunk (state=3): >>><<< 27885 1726882535.99165: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882535.99168: _low_level_execute_command(): starting 27885 1726882535.99173: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882535.9101217-28311-61794970032863/AnsiballZ_command.py && sleep 0' 27885 1726882535.99605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882535.99609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882535.99611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882535.99613: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882535.99615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882535.99660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882535.99663: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882535.99733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882536.15032: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:35:36.145094", "end": "2024-09-20 21:35:36.147937", "delta": "0:00:00.002843", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27885 1726882536.16303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882536.16330: stderr chunk (state=3): >>>Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882536.16383: stderr chunk (state=3): >>><<< 27885 1726882536.16406: stdout chunk (state=3): >>><<< 27885 1726882536.16440: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:35:36.145094", "end": "2024-09-20 21:35:36.147937", "delta": "0:00:00.002843", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
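
The module_args echoed in the result above (chdir "/sys/class/net", raw params "ls -1") pin down what "Gather current interface info" is doing: listing /sys/class/net to enumerate the kernel's network interfaces. A minimal sketch of the task as it likely appears at get_current_interfaces.yml:3; the register name is taken from the '_current_interfaces' variable referenced later in the log, and changed_when is an inference from the raw module result reporting changed=true while the task result below reports changed=false:

---
# Hypothetical reconstruction of get_current_interfaces.yml:3, based on the
# module_args in the result above; not the verbatim source.
- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces
  changed_when: false   # inferred: raw result says changed=true, reported task result says changed=false

The stdout captured here (bonding_masters, eth0, ethtest0, lo, peerethtest0, rpltstbr) is one line per entry in /sys/class/net, i.e. each interface plus the bonding_masters control file.
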
27885 1726882536.16571: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882535.9101217-28311-61794970032863/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882536.16574: _low_level_execute_command(): starting 27885 1726882536.16577: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882535.9101217-28311-61794970032863/ > /dev/null 2>&1 && sleep 0' 27885 1726882536.17161: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882536.17177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882536.17209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882536.17242: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882536.17267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882536.17357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882536.17387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882536.17484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882536.19265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882536.19288: stderr chunk (state=3): >>><<< 27885 1726882536.19296: stdout chunk (state=3): >>><<< 27885 1726882536.19307: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882536.19313: handler run complete 27885 1726882536.19330: Evaluated conditional (False): False 27885 1726882536.19342: attempt loop complete, returning result 27885 1726882536.19346: _execute() done 27885 1726882536.19348: dumping result to json 27885 1726882536.19350: done dumping result, returning 27885 1726882536.19358: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [12673a56-9f93-3fa5-01be-0000000003ab] 27885 1726882536.19363: sending task result for task 12673a56-9f93-3fa5-01be-0000000003ab 27885 1726882536.19466: done sending task result for task 12673a56-9f93-3fa5-01be-0000000003ab 27885 1726882536.19469: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.002843", "end": "2024-09-20 21:35:36.147937", "rc": 0, "start": "2024-09-20 21:35:36.145094" } STDOUT: bonding_masters eth0 ethtest0 lo peerethtest0 rpltstbr 27885 1726882536.19545: no more pending results, returning what we have 27885 1726882536.19548: results queue empty 27885 1726882536.19549: checking for any_errors_fatal 27885 1726882536.19550: done checking for any_errors_fatal 27885 1726882536.19551: checking for max_fail_percentage 27885 1726882536.19552: done checking for max_fail_percentage 27885 1726882536.19555: checking to see if all hosts have failed and the running result is not ok 27885 1726882536.19555: done checking to see if all hosts have failed 27885 1726882536.19556: getting the remaining hosts for this loop 27885 1726882536.19557: done getting the remaining hosts for this loop 27885 1726882536.19561: getting the next task for host managed_node2 27885 1726882536.19567: done getting next task for host managed_node2 27885 1726882536.19569: ^ task is: TASK: Set current_interfaces 27885 1726882536.19575: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882536.19580: getting variables 27885 1726882536.19581: in VariableManager get_vars() 27885 1726882536.19620: Calling all_inventory to load vars for managed_node2 27885 1726882536.19623: Calling groups_inventory to load vars for managed_node2 27885 1726882536.19624: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882536.19633: Calling all_plugins_play to load vars for managed_node2 27885 1726882536.19635: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882536.19638: Calling groups_plugins_play to load vars for managed_node2 27885 1726882536.19779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882536.19917: done with get_vars() 27885 1726882536.19927: done getting variables 27885 1726882536.19966: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:35:36 -0400 (0:00:00.331) 0:00:08.842 ****** 27885 1726882536.19987: entering _queue_task() for managed_node2/set_fact 27885 1726882536.20181: worker is 1 (out of 1 available) 27885 1726882536.20196: exiting _queue_task() for managed_node2/set_fact 27885 1726882536.20214: done queuing things up, now waiting for results queue to drain 27885 1726882536.20216: waiting for pending results... 
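
The "Set current_interfaces" task queued above (get_current_interfaces.yml:9) turns the registered command output into a list fact, which the "Show current_interfaces" debug task then prints. A hedged sketch, assuming the fact is built from stdout_lines of the registered result; the log confirms the variable names and the resulting list, but not the exact Jinja expression:

---
# Hypothetical reconstruction of get_current_interfaces.yml:9; the stdout_lines
# expression is an assumption, not quoted from the source.
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"

The ansible_facts shown a few entries below (['bonding_masters', 'eth0', 'ethtest0', 'lo', 'peerethtest0', 'rpltstbr']) match the stdout of the ls -1 run line for line, which is consistent with this reconstruction.
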
27885 1726882536.20611: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 27885 1726882536.20616: in run() - task 12673a56-9f93-3fa5-01be-0000000003ac 27885 1726882536.20619: variable 'ansible_search_path' from source: unknown 27885 1726882536.20621: variable 'ansible_search_path' from source: unknown 27885 1726882536.20624: calling self._execute() 27885 1726882536.20719: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882536.20731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882536.20748: variable 'omit' from source: magic vars 27885 1726882536.21229: variable 'ansible_distribution_major_version' from source: facts 27885 1726882536.21243: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882536.21254: variable 'omit' from source: magic vars 27885 1726882536.21311: variable 'omit' from source: magic vars 27885 1726882536.21387: variable '_current_interfaces' from source: set_fact 27885 1726882536.21435: variable 'omit' from source: magic vars 27885 1726882536.21465: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882536.21496: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882536.21510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882536.21523: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882536.21532: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882536.21554: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882536.21557: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882536.21560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882536.21633: Set connection var ansible_pipelining to False 27885 1726882536.21636: Set connection var ansible_connection to ssh 27885 1726882536.21641: Set connection var ansible_timeout to 10 27885 1726882536.21644: Set connection var ansible_shell_type to sh 27885 1726882536.21649: Set connection var ansible_shell_executable to /bin/sh 27885 1726882536.21653: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882536.21670: variable 'ansible_shell_executable' from source: unknown 27885 1726882536.21674: variable 'ansible_connection' from source: unknown 27885 1726882536.21676: variable 'ansible_module_compression' from source: unknown 27885 1726882536.21678: variable 'ansible_shell_type' from source: unknown 27885 1726882536.21680: variable 'ansible_shell_executable' from source: unknown 27885 1726882536.21682: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882536.21686: variable 'ansible_pipelining' from source: unknown 27885 1726882536.21688: variable 'ansible_timeout' from source: unknown 27885 1726882536.21695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882536.21785: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 27885 1726882536.21796: variable 'omit' from source: magic vars 27885 1726882536.21802: starting attempt loop 27885 1726882536.21810: running the handler 27885 1726882536.21817: handler run complete 27885 1726882536.21825: attempt loop complete, returning result 27885 1726882536.21829: _execute() done 27885 1726882536.21831: dumping result to json 27885 1726882536.21834: done dumping result, returning 27885 1726882536.21841: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [12673a56-9f93-3fa5-01be-0000000003ac] 27885 1726882536.21846: sending task result for task 12673a56-9f93-3fa5-01be-0000000003ac 27885 1726882536.21926: done sending task result for task 12673a56-9f93-3fa5-01be-0000000003ac 27885 1726882536.21929: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "ethtest0", "lo", "peerethtest0", "rpltstbr" ] }, "changed": false } 27885 1726882536.21984: no more pending results, returning what we have 27885 1726882536.21986: results queue empty 27885 1726882536.21987: checking for any_errors_fatal 27885 1726882536.21999: done checking for any_errors_fatal 27885 1726882536.21999: checking for max_fail_percentage 27885 1726882536.22001: done checking for max_fail_percentage 27885 1726882536.22002: checking to see if all hosts have failed and the running result is not ok 27885 1726882536.22003: done checking to see if all hosts have failed 27885 1726882536.22003: getting the remaining hosts for this loop 27885 1726882536.22004: done getting the remaining hosts for this loop 27885 1726882536.22007: getting the next task for host managed_node2 27885 1726882536.22013: done getting next task for host managed_node2 27885 1726882536.22015: ^ task is: TASK: Show current_interfaces 27885 1726882536.22020: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882536.22023: getting variables 27885 1726882536.22024: in VariableManager get_vars() 27885 1726882536.22056: Calling all_inventory to load vars for managed_node2 27885 1726882536.22058: Calling groups_inventory to load vars for managed_node2 27885 1726882536.22060: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882536.22068: Calling all_plugins_play to load vars for managed_node2 27885 1726882536.22071: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882536.22074: Calling groups_plugins_play to load vars for managed_node2 27885 1726882536.22222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882536.22347: done with get_vars() 27885 1726882536.22354: done getting variables 27885 1726882536.22392: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:35:36 -0400 (0:00:00.024) 0:00:08.866 ****** 27885 1726882536.22415: entering _queue_task() for managed_node2/debug 27885 1726882536.22586: worker is 1 (out of 1 available) 27885 1726882536.22605: exiting _queue_task() for managed_node2/debug 27885 1726882536.22617: done queuing things up, now waiting for results queue to drain 27885 1726882536.22618: waiting for pending results... 27885 1726882536.22755: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 27885 1726882536.22814: in run() - task 12673a56-9f93-3fa5-01be-000000000375 27885 1726882536.22825: variable 'ansible_search_path' from source: unknown 27885 1726882536.22828: variable 'ansible_search_path' from source: unknown 27885 1726882536.22856: calling self._execute() 27885 1726882536.22915: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882536.22919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882536.22927: variable 'omit' from source: magic vars 27885 1726882536.23183: variable 'ansible_distribution_major_version' from source: facts 27885 1726882536.23218: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882536.23221: variable 'omit' from source: magic vars 27885 1726882536.23297: variable 'omit' from source: magic vars 27885 1726882536.23371: variable 'current_interfaces' from source: set_fact 27885 1726882536.23374: variable 'omit' from source: magic vars 27885 1726882536.23400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882536.23432: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882536.23454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882536.23501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882536.23504: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882536.23507: 
variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882536.23511: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882536.23513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882536.23615: Set connection var ansible_pipelining to False 27885 1726882536.23620: Set connection var ansible_connection to ssh 27885 1726882536.23699: Set connection var ansible_timeout to 10 27885 1726882536.23703: Set connection var ansible_shell_type to sh 27885 1726882536.23705: Set connection var ansible_shell_executable to /bin/sh 27885 1726882536.23707: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882536.23710: variable 'ansible_shell_executable' from source: unknown 27885 1726882536.23712: variable 'ansible_connection' from source: unknown 27885 1726882536.23714: variable 'ansible_module_compression' from source: unknown 27885 1726882536.23716: variable 'ansible_shell_type' from source: unknown 27885 1726882536.23718: variable 'ansible_shell_executable' from source: unknown 27885 1726882536.23720: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882536.23722: variable 'ansible_pipelining' from source: unknown 27885 1726882536.23723: variable 'ansible_timeout' from source: unknown 27885 1726882536.23725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882536.23797: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882536.23822: variable 'omit' from source: magic vars 27885 1726882536.23825: starting attempt loop 27885 1726882536.23827: running the handler 27885 1726882536.23928: handler run complete 27885 1726882536.23931: attempt loop complete, returning result 27885 1726882536.23933: _execute() done 27885 1726882536.23934: dumping result to json 27885 1726882536.23936: done dumping result, returning 27885 1726882536.23938: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [12673a56-9f93-3fa5-01be-000000000375] 27885 1726882536.23940: sending task result for task 12673a56-9f93-3fa5-01be-000000000375 27885 1726882536.24003: done sending task result for task 12673a56-9f93-3fa5-01be-000000000375 27885 1726882536.24006: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'ethtest0', 'lo', 'peerethtest0', 'rpltstbr'] 27885 1726882536.24041: no more pending results, returning what we have 27885 1726882536.24043: results queue empty 27885 1726882536.24044: checking for any_errors_fatal 27885 1726882536.24049: done checking for any_errors_fatal 27885 1726882536.24050: checking for max_fail_percentage 27885 1726882536.24051: done checking for max_fail_percentage 27885 1726882536.24052: checking to see if all hosts have failed and the running result is not ok 27885 1726882536.24053: done checking to see if all hosts have failed 27885 1726882536.24053: getting the remaining hosts for this loop 27885 1726882536.24054: done getting the remaining hosts for this loop 27885 1726882536.24057: getting the next task for host managed_node2 27885 1726882536.24063: done getting next task for host managed_node2 27885 1726882536.24066: ^ task is: TASK: Install iproute 27885 1726882536.24068: ^ state is: HOST 
STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882536.24072: getting variables 27885 1726882536.24073: in VariableManager get_vars() 27885 1726882536.24123: Calling all_inventory to load vars for managed_node2 27885 1726882536.24126: Calling groups_inventory to load vars for managed_node2 27885 1726882536.24128: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882536.24137: Calling all_plugins_play to load vars for managed_node2 27885 1726882536.24140: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882536.24143: Calling groups_plugins_play to load vars for managed_node2 27885 1726882536.24362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882536.24599: done with get_vars() 27885 1726882536.24610: done getting variables 27885 1726882536.24670: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:35:36 -0400 (0:00:00.022) 0:00:08.889 ****** 27885 1726882536.24699: entering _queue_task() for managed_node2/package 27885 1726882536.24926: worker is 1 (out of 1 available) 27885 1726882536.24938: exiting _queue_task() for managed_node2/package 27885 1726882536.24950: done queuing things up, now waiting for results queue to drain 27885 1726882536.24951: waiting for pending results... 
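The TASK [Install iproute] banner above comes from manage_test_interface.yml:16; the task source itself is not reproduced in this trace. What the trace does show a few entries below is the generic package action resolving to the cached ansible.modules.dnf payload and being invoked with name: ["iproute"] and state: "present". On this Fedora/RHEL-family node the requested transaction is roughly equivalent to the following sketch (for orientation only, not a command captured in this log); because iproute is already installed, the module result further down reports "Nothing to do":

    # Rough on-host equivalent of the dnf module invocation traced below
    # (a sketch for orientation; not a command captured in this trace):
    dnf install -y iproute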
27885 1726882536.25114: running TaskExecutor() for managed_node2/TASK: Install iproute 27885 1726882536.25172: in run() - task 12673a56-9f93-3fa5-01be-0000000002ff 27885 1726882536.25184: variable 'ansible_search_path' from source: unknown 27885 1726882536.25190: variable 'ansible_search_path' from source: unknown 27885 1726882536.25221: calling self._execute() 27885 1726882536.25281: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882536.25286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882536.25297: variable 'omit' from source: magic vars 27885 1726882536.25614: variable 'ansible_distribution_major_version' from source: facts 27885 1726882536.25624: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882536.25628: variable 'omit' from source: magic vars 27885 1726882536.25654: variable 'omit' from source: magic vars 27885 1726882536.25782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882536.27598: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882536.27611: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882536.27661: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882536.27705: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882536.27739: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882536.27831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882536.27865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882536.27899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882536.27944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882536.27965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882536.28142: variable '__network_is_ostree' from source: set_fact 27885 1726882536.28145: variable 'omit' from source: magic vars 27885 1726882536.28147: variable 'omit' from source: magic vars 27885 1726882536.28150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882536.28187: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882536.28191: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882536.28206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 27885 1726882536.28220: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882536.28241: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882536.28244: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882536.28246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882536.28323: Set connection var ansible_pipelining to False 27885 1726882536.28327: Set connection var ansible_connection to ssh 27885 1726882536.28332: Set connection var ansible_timeout to 10 27885 1726882536.28335: Set connection var ansible_shell_type to sh 27885 1726882536.28339: Set connection var ansible_shell_executable to /bin/sh 27885 1726882536.28344: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882536.28361: variable 'ansible_shell_executable' from source: unknown 27885 1726882536.28364: variable 'ansible_connection' from source: unknown 27885 1726882536.28367: variable 'ansible_module_compression' from source: unknown 27885 1726882536.28369: variable 'ansible_shell_type' from source: unknown 27885 1726882536.28371: variable 'ansible_shell_executable' from source: unknown 27885 1726882536.28373: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882536.28377: variable 'ansible_pipelining' from source: unknown 27885 1726882536.28379: variable 'ansible_timeout' from source: unknown 27885 1726882536.28392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882536.28596: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882536.28602: variable 'omit' from source: magic vars 27885 1726882536.28605: starting attempt loop 27885 1726882536.28607: running the handler 27885 1726882536.28609: variable 'ansible_facts' from source: unknown 27885 1726882536.28611: variable 'ansible_facts' from source: unknown 27885 1726882536.28613: _low_level_execute_command(): starting 27885 1726882536.28615: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882536.29300: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882536.29334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 
1726882536.29355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882536.29369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882536.29476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882536.31034: stdout chunk (state=3): >>>/root <<< 27885 1726882536.31148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882536.31170: stdout chunk (state=3): >>><<< 27885 1726882536.31172: stderr chunk (state=3): >>><<< 27885 1726882536.31184: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882536.31259: _low_level_execute_command(): starting 27885 1726882536.31263: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882536.3119671-28334-66915951592834 `" && echo ansible-tmp-1726882536.3119671-28334-66915951592834="` echo /root/.ansible/tmp/ansible-tmp-1726882536.3119671-28334-66915951592834 `" ) && sleep 0' 27885 1726882536.31983: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882536.33840: stdout chunk (state=3): 
>>>ansible-tmp-1726882536.3119671-28334-66915951592834=/root/.ansible/tmp/ansible-tmp-1726882536.3119671-28334-66915951592834 <<< 27885 1726882536.33976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882536.33985: stdout chunk (state=3): >>><<< 27885 1726882536.33996: stderr chunk (state=3): >>><<< 27885 1726882536.34020: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882536.3119671-28334-66915951592834=/root/.ansible/tmp/ansible-tmp-1726882536.3119671-28334-66915951592834 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882536.34119: variable 'ansible_module_compression' from source: unknown 27885 1726882536.34121: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 27885 1726882536.34159: variable 'ansible_facts' from source: unknown 27885 1726882536.34290: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882536.3119671-28334-66915951592834/AnsiballZ_dnf.py 27885 1726882536.34463: Sending initial data 27885 1726882536.34472: Sent initial data (151 bytes) 27885 1726882536.35112: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882536.35127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882536.35217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882536.35252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882536.35283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 27885 1726882536.35371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882536.36913: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882536.37002: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882536.37085: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpkyzo4kqp /root/.ansible/tmp/ansible-tmp-1726882536.3119671-28334-66915951592834/AnsiballZ_dnf.py <<< 27885 1726882536.37088: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882536.3119671-28334-66915951592834/AnsiballZ_dnf.py" <<< 27885 1726882536.37188: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpkyzo4kqp" to remote "/root/.ansible/tmp/ansible-tmp-1726882536.3119671-28334-66915951592834/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882536.3119671-28334-66915951592834/AnsiballZ_dnf.py" <<< 27885 1726882536.38302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882536.38369: stderr chunk (state=3): >>><<< 27885 1726882536.38399: stdout chunk (state=3): >>><<< 27885 1726882536.38407: done transferring module to remote 27885 1726882536.38424: _low_level_execute_command(): starting 27885 1726882536.38434: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882536.3119671-28334-66915951592834/ /root/.ansible/tmp/ansible-tmp-1726882536.3119671-28334-66915951592834/AnsiballZ_dnf.py && sleep 0' 27885 1726882536.39012: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882536.39026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882536.39042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882536.39060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882536.39082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882536.39173: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882536.39195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882536.39211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882536.39304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882536.41011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882536.41065: stderr chunk (state=3): >>><<< 27885 1726882536.41073: stdout chunk (state=3): >>><<< 27885 1726882536.41094: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882536.41103: _low_level_execute_command(): starting 27885 1726882536.41113: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882536.3119671-28334-66915951592834/AnsiballZ_dnf.py && sleep 0' 27885 1726882536.41691: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882536.41709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882536.41723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882536.41739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882536.41757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882536.41767: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882536.41859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882536.41877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882536.41890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882536.41991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882536.82407: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 27885 1726882536.86368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882536.86388: stderr chunk (state=3): >>><<< 27885 1726882536.86396: stdout chunk (state=3): >>><<< 27885 1726882536.86418: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
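The chunked stdout/stderr above is one complete module round trip over the multiplexed SSH connection: probe the remote home directory, create a private per-task temporary directory, copy the self-contained AnsiballZ_dnf.py payload over SFTP, mark it executable, run it with the remote /usr/bin/python3.12, and read the module's JSON result from stdout; the temporary directory is removed in the entries that follow. Condensed from the _low_level_execute_command() calls above (the generated directory name is shortened to $TMP for readability, and the sftp transfer is shown as a comment because it is not a /bin/sh command):

    # Controller-side command sequence for this task, condensed from the trace.
    TMP=/root/.ansible/tmp/ansible-tmp-1726882536.3119671-28334-66915951592834
    /bin/sh -c 'echo ~ && sleep 0'                                          # 1. discover the remote home directory
    /bin/sh -c "( umask 77 && mkdir -p /root/.ansible/tmp && mkdir $TMP )"  # 2. private per-task temp dir
    # 3. sftp: put AnsiballZ_dnf.py "$TMP/AnsiballZ_dnf.py"                 #    zipped module code plus its Python deps
    /bin/sh -c "chmod u+x $TMP/ $TMP/AnsiballZ_dnf.py"                      # 4. make the payload executable
    /bin/sh -c "/usr/bin/python3.12 $TMP/AnsiballZ_dnf.py"                  # 5. run it; the JSON result arrives on stdout
    /bin/sh -c "rm -f -r $TMP/ > /dev/null 2>&1"                            # 6. cleanup, issued in the next entries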
27885 1726882536.86453: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882536.3119671-28334-66915951592834/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882536.86461: _low_level_execute_command(): starting 27885 1726882536.86463: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882536.3119671-28334-66915951592834/ > /dev/null 2>&1 && sleep 0' 27885 1726882536.86884: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882536.86887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882536.86892: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882536.86896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882536.86936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882536.86941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882536.87020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882536.88840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882536.88866: stderr chunk (state=3): >>><<< 27885 1726882536.88869: stdout chunk (state=3): >>><<< 27885 1726882536.88881: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882536.88887: handler run complete 27885 1726882536.89002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882536.89127: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882536.89155: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882536.89200: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882536.89225: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882536.89275: variable '__install_status' from source: set_fact 27885 1726882536.89290: Evaluated conditional (__install_status is success): True 27885 1726882536.89307: attempt loop complete, returning result 27885 1726882536.89310: _execute() done 27885 1726882536.89313: dumping result to json 27885 1726882536.89318: done dumping result, returning 27885 1726882536.89324: done running TaskExecutor() for managed_node2/TASK: Install iproute [12673a56-9f93-3fa5-01be-0000000002ff] 27885 1726882536.89329: sending task result for task 12673a56-9f93-3fa5-01be-0000000002ff 27885 1726882536.89423: done sending task result for task 12673a56-9f93-3fa5-01be-0000000002ff 27885 1726882536.89426: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 27885 1726882536.89504: no more pending results, returning what we have 27885 1726882536.89508: results queue empty 27885 1726882536.89509: checking for any_errors_fatal 27885 1726882536.89515: done checking for any_errors_fatal 27885 1726882536.89515: checking for max_fail_percentage 27885 1726882536.89517: done checking for max_fail_percentage 27885 1726882536.89517: checking to see if all hosts have failed and the running result is not ok 27885 1726882536.89518: done checking to see if all hosts have failed 27885 1726882536.89519: getting the remaining hosts for this loop 27885 1726882536.89520: done getting the remaining hosts for this loop 27885 1726882536.89524: getting the next task for host managed_node2 27885 1726882536.89529: done getting next task for host managed_node2 27885 1726882536.89532: ^ task is: TASK: Create veth interface {{ interface }} 27885 1726882536.89536: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882536.89540: getting variables 27885 1726882536.89541: in VariableManager get_vars() 27885 1726882536.89643: Calling all_inventory to load vars for managed_node2 27885 1726882536.89646: Calling groups_inventory to load vars for managed_node2 27885 1726882536.89648: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882536.89657: Calling all_plugins_play to load vars for managed_node2 27885 1726882536.89659: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882536.89662: Calling groups_plugins_play to load vars for managed_node2 27885 1726882536.89780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882536.89910: done with get_vars() 27885 1726882536.89918: done getting variables 27885 1726882536.89959: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882536.90043: variable 'interface' from source: set_fact TASK [Create veth interface ethtest1] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:35:36 -0400 (0:00:00.653) 0:00:09.543 ****** 27885 1726882536.90064: entering _queue_task() for managed_node2/command 27885 1726882536.90256: worker is 1 (out of 1 available) 27885 1726882536.90270: exiting _queue_task() for managed_node2/command 27885 1726882536.90283: done queuing things up, now waiting for results queue to drain 27885 1726882536.90285: waiting for pending results... 
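The TASK [Create veth interface ethtest1] banner above comes from manage_test_interface.yml:27, with the interface variable already resolved to ethtest1. The entries below show the command action running under the items lookup, guarded by the conditional type == 'veth' and state == 'present' and interface not in current_interfaces (ethtest1 is not in the list printed earlier, so the task runs). The first, and in this excerpt only, loop item resolves to the ip invocation visible in the module result at the end of the section; the peer end is named by prefixing "peer" to the interface name:

    # The command executed by the first loop item (taken from the module result
    # below); generic form: ip link add <name> type veth peer name <peer>
    ip link add ethtest1 type veth peer name peerethtest1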
27885 1726882536.90462: running TaskExecutor() for managed_node2/TASK: Create veth interface ethtest1 27885 1726882536.90600: in run() - task 12673a56-9f93-3fa5-01be-000000000300 27885 1726882536.90603: variable 'ansible_search_path' from source: unknown 27885 1726882536.90606: variable 'ansible_search_path' from source: unknown 27885 1726882536.90830: variable 'interface' from source: set_fact 27885 1726882536.90913: variable 'interface' from source: set_fact 27885 1726882536.90987: variable 'interface' from source: set_fact 27885 1726882536.91163: Loaded config def from plugin (lookup/items) 27885 1726882536.91190: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 27885 1726882536.91216: variable 'omit' from source: magic vars 27885 1726882536.91331: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882536.91399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882536.91403: variable 'omit' from source: magic vars 27885 1726882536.91901: variable 'ansible_distribution_major_version' from source: facts 27885 1726882536.91917: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882536.92142: variable 'type' from source: set_fact 27885 1726882536.92154: variable 'state' from source: include params 27885 1726882536.92165: variable 'interface' from source: set_fact 27885 1726882536.92175: variable 'current_interfaces' from source: set_fact 27885 1726882536.92188: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 27885 1726882536.92204: variable 'omit' from source: magic vars 27885 1726882536.92243: variable 'omit' from source: magic vars 27885 1726882536.92295: variable 'item' from source: unknown 27885 1726882536.92370: variable 'item' from source: unknown 27885 1726882536.92399: variable 'omit' from source: magic vars 27885 1726882536.92467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882536.92471: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882536.92492: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882536.92518: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882536.92536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882536.92570: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882536.92599: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882536.92603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882536.92695: Set connection var ansible_pipelining to False 27885 1726882536.92799: Set connection var ansible_connection to ssh 27885 1726882536.92803: Set connection var ansible_timeout to 10 27885 1726882536.92805: Set connection var ansible_shell_type to sh 27885 1726882536.92807: Set connection var ansible_shell_executable to /bin/sh 27885 1726882536.92809: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882536.92811: variable 'ansible_shell_executable' from source: unknown 27885 1726882536.92813: variable 'ansible_connection' from source: unknown 27885 1726882536.92815: variable 
'ansible_module_compression' from source: unknown 27885 1726882536.92818: variable 'ansible_shell_type' from source: unknown 27885 1726882536.92820: variable 'ansible_shell_executable' from source: unknown 27885 1726882536.92822: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882536.92824: variable 'ansible_pipelining' from source: unknown 27885 1726882536.92826: variable 'ansible_timeout' from source: unknown 27885 1726882536.92827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882536.92943: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882536.92987: variable 'omit' from source: magic vars 27885 1726882536.93002: starting attempt loop 27885 1726882536.93010: running the handler 27885 1726882536.93030: _low_level_execute_command(): starting 27885 1726882536.93041: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882536.93927: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882536.94289: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882536.94309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882536.94406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882536.95979: stdout chunk (state=3): >>>/root <<< 27885 1726882536.96116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882536.96129: stdout chunk (state=3): >>><<< 27885 1726882536.96144: stderr chunk (state=3): >>><<< 27885 1726882536.96173: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882536.96206: _low_level_execute_command(): starting 27885 1726882536.96220: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882536.9618845-28363-254979655592958 `" && echo ansible-tmp-1726882536.9618845-28363-254979655592958="` echo /root/.ansible/tmp/ansible-tmp-1726882536.9618845-28363-254979655592958 `" ) && sleep 0' 27885 1726882536.96838: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882536.97011: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882536.97028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882536.97047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882536.97076: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882536.97184: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882536.97245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882536.97248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882536.97321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882536.99199: stdout chunk (state=3): >>>ansible-tmp-1726882536.9618845-28363-254979655592958=/root/.ansible/tmp/ansible-tmp-1726882536.9618845-28363-254979655592958 <<< 27885 1726882536.99334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882536.99343: stdout chunk (state=3): >>><<< 27885 1726882536.99353: stderr chunk (state=3): >>><<< 27885 1726882536.99371: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882536.9618845-28363-254979655592958=/root/.ansible/tmp/ansible-tmp-1726882536.9618845-28363-254979655592958 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882536.99407: variable 'ansible_module_compression' from source: unknown 27885 1726882536.99458: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882536.99598: variable 'ansible_facts' from source: unknown 27885 1726882536.99602: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882536.9618845-28363-254979655592958/AnsiballZ_command.py 27885 1726882536.99824: Sending initial data 27885 1726882536.99836: Sent initial data (156 bytes) 27885 1726882537.00323: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882537.00340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882537.00355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882537.00382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882537.00478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882537.00504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882537.00598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882537.02111: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 
debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882537.02202: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882537.02279: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpivxvy03k /root/.ansible/tmp/ansible-tmp-1726882536.9618845-28363-254979655592958/AnsiballZ_command.py <<< 27885 1726882537.02283: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882536.9618845-28363-254979655592958/AnsiballZ_command.py" <<< 27885 1726882537.02332: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpivxvy03k" to remote "/root/.ansible/tmp/ansible-tmp-1726882536.9618845-28363-254979655592958/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882536.9618845-28363-254979655592958/AnsiballZ_command.py" <<< 27885 1726882537.03205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882537.03208: stdout chunk (state=3): >>><<< 27885 1726882537.03210: stderr chunk (state=3): >>><<< 27885 1726882537.03218: done transferring module to remote 27885 1726882537.03232: _low_level_execute_command(): starting 27885 1726882537.03239: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882536.9618845-28363-254979655592958/ /root/.ansible/tmp/ansible-tmp-1726882536.9618845-28363-254979655592958/AnsiballZ_command.py && sleep 0' 27885 1726882537.03934: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882537.03949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882537.04026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882537.04087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882537.04109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882537.04144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882537.04243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882537.05998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882537.06002: stdout chunk (state=3): >>><<< 27885 1726882537.06020: stderr chunk (state=3): >>><<< 27885 1726882537.06117: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882537.06125: _low_level_execute_command(): starting 27885 1726882537.06129: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882536.9618845-28363-254979655592958/AnsiballZ_command.py && sleep 0' 27885 1726882537.06668: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882537.06676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882537.06686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882537.06706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882537.06718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882537.06725: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882537.06739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882537.06753: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27885 1726882537.06761: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882537.06767: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27885 1726882537.06775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882537.06784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882537.06801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882537.06852: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882537.06887: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882537.06907: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882537.06941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882537.07006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882537.22606: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": 
["ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1"], "start": "2024-09-20 21:35:37.216958", "end": "2024-09-20 21:35:37.224182", "delta": "0:00:00.007224", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest1 type veth peer name peerethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27885 1726882537.24498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882537.24522: stderr chunk (state=3): >>>Shared connection to 10.31.14.69 closed. <<< 27885 1726882537.24598: stderr chunk (state=3): >>><<< 27885 1726882537.24616: stdout chunk (state=3): >>><<< 27885 1726882537.24659: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1"], "start": "2024-09-20 21:35:37.216958", "end": "2024-09-20 21:35:37.224182", "delta": "0:00:00.007224", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest1 type veth peer name peerethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
27885 1726882537.24707: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest1 type veth peer name peerethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882536.9618845-28363-254979655592958/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882537.24722: _low_level_execute_command(): starting 27885 1726882537.24745: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882536.9618845-28363-254979655592958/ > /dev/null 2>&1 && sleep 0' 27885 1726882537.25526: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882537.25567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882537.25582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882537.25610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882537.26180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882537.29899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882537.29902: stdout chunk (state=3): >>><<< 27885 1726882537.29909: stderr chunk (state=3): >>><<< 27885 1726882537.29912: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882537.29914: handler run complete 27885 1726882537.29916: Evaluated conditional (False): False 27885 1726882537.29918: attempt loop complete, returning result 27885 1726882537.29920: variable 'item' from source: unknown 27885 1726882537.29922: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link add ethtest1 type veth peer name peerethtest1) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1" ], "delta": "0:00:00.007224", "end": "2024-09-20 21:35:37.224182", "item": "ip link add ethtest1 type veth peer name peerethtest1", "rc": 0, "start": "2024-09-20 21:35:37.216958" } 27885 1726882537.30299: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882537.30302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882537.30304: variable 'omit' from source: magic vars 27885 1726882537.30371: variable 'ansible_distribution_major_version' from source: facts 27885 1726882537.30381: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882537.30578: variable 'type' from source: set_fact 27885 1726882537.30635: variable 'state' from source: include params 27885 1726882537.30638: variable 'interface' from source: set_fact 27885 1726882537.30640: variable 'current_interfaces' from source: set_fact 27885 1726882537.30642: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 27885 1726882537.30644: variable 'omit' from source: magic vars 27885 1726882537.30646: variable 'omit' from source: magic vars 27885 1726882537.30677: variable 'item' from source: unknown 27885 1726882537.30747: variable 'item' from source: unknown 27885 1726882537.30765: variable 'omit' from source: magic vars 27885 1726882537.30788: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882537.30807: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882537.30818: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882537.30852: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882537.30855: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882537.30857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882537.30934: Set connection var ansible_pipelining to False 27885 1726882537.30961: Set connection var ansible_connection to ssh 27885 1726882537.30964: Set connection var ansible_timeout to 10 27885 1726882537.30966: Set connection var ansible_shell_type to sh 27885 1726882537.30968: Set connection var ansible_shell_executable to /bin/sh 27885 1726882537.31070: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882537.31073: variable 'ansible_shell_executable' from source: unknown 
27885 1726882537.31076: variable 'ansible_connection' from source: unknown 27885 1726882537.31078: variable 'ansible_module_compression' from source: unknown 27885 1726882537.31080: variable 'ansible_shell_type' from source: unknown 27885 1726882537.31082: variable 'ansible_shell_executable' from source: unknown 27885 1726882537.31084: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882537.31086: variable 'ansible_pipelining' from source: unknown 27885 1726882537.31088: variable 'ansible_timeout' from source: unknown 27885 1726882537.31092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882537.31149: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882537.31162: variable 'omit' from source: magic vars 27885 1726882537.31171: starting attempt loop 27885 1726882537.31182: running the handler 27885 1726882537.31198: _low_level_execute_command(): starting 27885 1726882537.31207: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882537.31850: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882537.31932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882537.31979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882537.32063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882537.33629: stdout chunk (state=3): >>>/root <<< 27885 1726882537.33772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882537.33775: stdout chunk (state=3): >>><<< 27885 1726882537.33778: stderr chunk (state=3): >>><<< 27885 1726882537.33792: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882537.33809: _low_level_execute_command(): starting 27885 1726882537.33856: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882537.3379927-28363-131871050252547 `" && echo ansible-tmp-1726882537.3379927-28363-131871050252547="` echo /root/.ansible/tmp/ansible-tmp-1726882537.3379927-28363-131871050252547 `" ) && sleep 0' 27885 1726882537.34265: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882537.34278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882537.34294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882537.34343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882537.34346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882537.34417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882537.36272: stdout chunk (state=3): >>>ansible-tmp-1726882537.3379927-28363-131871050252547=/root/.ansible/tmp/ansible-tmp-1726882537.3379927-28363-131871050252547 <<< 27885 1726882537.36380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882537.36412: stderr chunk (state=3): >>><<< 27885 1726882537.36415: stdout chunk (state=3): >>><<< 27885 1726882537.36427: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882537.3379927-28363-131871050252547=/root/.ansible/tmp/ansible-tmp-1726882537.3379927-28363-131871050252547 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882537.36448: variable 'ansible_module_compression' from source: unknown 27885 1726882537.36478: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882537.36498: variable 'ansible_facts' from source: unknown 27885 1726882537.36542: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882537.3379927-28363-131871050252547/AnsiballZ_command.py 27885 1726882537.36626: Sending initial data 27885 1726882537.36629: Sent initial data (156 bytes) 27885 1726882537.37023: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882537.37026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882537.37028: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882537.37030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882537.37077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882537.37080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882537.37145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882537.38672: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882537.38750: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882537.38846: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp7aom4axe /root/.ansible/tmp/ansible-tmp-1726882537.3379927-28363-131871050252547/AnsiballZ_command.py <<< 27885 1726882537.38849: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882537.3379927-28363-131871050252547/AnsiballZ_command.py" <<< 27885 1726882537.38929: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp7aom4axe" to remote "/root/.ansible/tmp/ansible-tmp-1726882537.3379927-28363-131871050252547/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882537.3379927-28363-131871050252547/AnsiballZ_command.py" <<< 27885 1726882537.39787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882537.39899: stderr chunk (state=3): >>><<< 27885 1726882537.39902: stdout chunk (state=3): >>><<< 27885 1726882537.39905: done transferring module to remote 27885 1726882537.39907: _low_level_execute_command(): starting 27885 1726882537.39909: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882537.3379927-28363-131871050252547/ /root/.ansible/tmp/ansible-tmp-1726882537.3379927-28363-131871050252547/AnsiballZ_command.py && sleep 0' 27885 1726882537.40290: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882537.40296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882537.40298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27885 1726882537.40300: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882537.40302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882537.40351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882537.40358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882537.40420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882537.42272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882537.42276: stdout chunk (state=3): >>><<< 27885 1726882537.42278: stderr chunk (state=3): >>><<< 27885 1726882537.42280: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882537.42282: _low_level_execute_command(): starting 27885 1726882537.42284: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882537.3379927-28363-131871050252547/AnsiballZ_command.py && sleep 0' 27885 1726882537.42750: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882537.42760: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882537.42770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882537.42781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882537.42796: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882537.42806: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882537.42814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882537.42887: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882537.42909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882537.42969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882537.58349: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest1", "up"], "start": "2024-09-20 21:35:37.578932", "end": "2024-09-20 21:35:37.582579", "delta": "0:00:00.003647", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27885 
1726882537.59790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882537.59798: stdout chunk (state=3): >>><<< 27885 1726882537.59801: stderr chunk (state=3): >>><<< 27885 1726882537.59940: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest1", "up"], "start": "2024-09-20 21:35:37.578932", "end": "2024-09-20 21:35:37.582579", "delta": "0:00:00.003647", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
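
Each loop item goes through the same low-level sequence that the surrounding _low_level_execute_command() entries record: create a timestamped temporary directory under /root/.ansible/tmp, upload AnsiballZ_command.py over the multiplexed SSH/SFTP channel, mark it executable, run it with /usr/bin/python3.12, and remove the directory afterwards. The sketch below condenses that sequence; TMPDIR is a placeholder for the ansible-tmp-<timestamp> directory seen in the log, and the upload step is shown as a comment because it runs over sftp rather than through /bin/sh.

# Placeholder for the timestamped remote directory from the log,
# e.g. /root/.ansible/tmp/ansible-tmp-<timestamp>-<pid>-<random>
TMPDIR=/root/.ansible/tmp/ansible-tmp-EXAMPLE

( umask 77 && mkdir -p /root/.ansible/tmp && mkdir "$TMPDIR" )      # create the remote tmpdir
# sftp> put <local AnsiballZ_command.py> "$TMPDIR/AnsiballZ_command.py"   (upload over the SSH mux channel)
chmod u+x "$TMPDIR" "$TMPDIR/AnsiballZ_command.py"                  # make the module wrapper executable
/usr/bin/python3.12 "$TMPDIR/AnsiballZ_command.py"                  # run the command module, JSON result on stdout
rm -f -r "$TMPDIR" > /dev/null 2>&1                                 # clean up the tmpdir
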
27885 1726882537.59948: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest1 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882537.3379927-28363-131871050252547/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882537.59951: _low_level_execute_command(): starting 27885 1726882537.59954: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882537.3379927-28363-131871050252547/ > /dev/null 2>&1 && sleep 0' 27885 1726882537.60639: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882537.60655: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882537.60695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882537.60790: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882537.60813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882537.60910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882537.62899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882537.62902: stderr chunk (state=3): >>><<< 27885 1726882537.62904: stdout chunk (state=3): >>><<< 27885 1726882537.62907: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882537.62909: handler run complete 27885 1726882537.62911: Evaluated conditional (False): False 27885 1726882537.62913: attempt loop complete, returning result 27885 1726882537.62914: variable 'item' from source: unknown 27885 1726882537.62916: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set peerethtest1 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest1", "up" ], "delta": "0:00:00.003647", "end": "2024-09-20 21:35:37.582579", "item": "ip link set peerethtest1 up", "rc": 0, "start": "2024-09-20 21:35:37.578932" } 27885 1726882537.63015: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882537.63018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882537.63020: variable 'omit' from source: magic vars 27885 1726882537.63178: variable 'ansible_distribution_major_version' from source: facts 27885 1726882537.63182: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882537.63388: variable 'type' from source: set_fact 27885 1726882537.63391: variable 'state' from source: include params 27885 1726882537.63401: variable 'interface' from source: set_fact 27885 1726882537.63404: variable 'current_interfaces' from source: set_fact 27885 1726882537.63412: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 27885 1726882537.63415: variable 'omit' from source: magic vars 27885 1726882537.63467: variable 'omit' from source: magic vars 27885 1726882537.63476: variable 'item' from source: unknown 27885 1726882537.63549: variable 'item' from source: unknown 27885 1726882537.63564: variable 'omit' from source: magic vars 27885 1726882537.63586: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882537.63681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882537.63685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882537.63687: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882537.63689: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882537.63691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882537.63716: Set connection var ansible_pipelining to False 27885 1726882537.63719: Set connection var ansible_connection to ssh 27885 1726882537.63726: Set connection var ansible_timeout to 10 27885 1726882537.63728: Set connection var ansible_shell_type to sh 27885 1726882537.63734: Set connection var ansible_shell_executable to /bin/sh 27885 1726882537.63739: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882537.63769: variable 'ansible_shell_executable' from source: unknown 27885 1726882537.63772: variable 'ansible_connection' from source: 
unknown 27885 1726882537.63775: variable 'ansible_module_compression' from source: unknown 27885 1726882537.63777: variable 'ansible_shell_type' from source: unknown 27885 1726882537.63779: variable 'ansible_shell_executable' from source: unknown 27885 1726882537.63781: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882537.63783: variable 'ansible_pipelining' from source: unknown 27885 1726882537.63808: variable 'ansible_timeout' from source: unknown 27885 1726882537.63811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882537.63900: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882537.63908: variable 'omit' from source: magic vars 27885 1726882537.63998: starting attempt loop 27885 1726882537.64001: running the handler 27885 1726882537.64003: _low_level_execute_command(): starting 27885 1726882537.64005: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882537.64566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882537.64575: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882537.64586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882537.64607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882537.64636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882537.64648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882537.64730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882537.64754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882537.64841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882537.66397: stdout chunk (state=3): >>>/root <<< 27885 1726882537.66546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882537.66549: stdout chunk (state=3): >>><<< 27885 1726882537.66551: stderr chunk (state=3): >>><<< 27885 1726882537.66566: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882537.66649: _low_level_execute_command(): starting 27885 1726882537.66653: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882537.6657043-28363-172415241188346 `" && echo ansible-tmp-1726882537.6657043-28363-172415241188346="` echo /root/.ansible/tmp/ansible-tmp-1726882537.6657043-28363-172415241188346 `" ) && sleep 0' 27885 1726882537.67213: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882537.67312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882537.67326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882537.67343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882537.67364: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882537.67453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882537.69309: stdout chunk (state=3): >>>ansible-tmp-1726882537.6657043-28363-172415241188346=/root/.ansible/tmp/ansible-tmp-1726882537.6657043-28363-172415241188346 <<< 27885 1726882537.69468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882537.69471: stdout chunk (state=3): >>><<< 27885 1726882537.69473: stderr chunk (state=3): >>><<< 27885 1726882537.69492: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882537.6657043-28363-172415241188346=/root/.ansible/tmp/ansible-tmp-1726882537.6657043-28363-172415241188346 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882537.69700: variable 'ansible_module_compression' from source: unknown 27885 1726882537.69703: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882537.69705: variable 'ansible_facts' from source: unknown 27885 1726882537.69707: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882537.6657043-28363-172415241188346/AnsiballZ_command.py 27885 1726882537.69843: Sending initial data 27885 1726882537.69847: Sent initial data (156 bytes) 27885 1726882537.70380: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882537.70477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882537.70496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882537.70513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882537.70598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882537.72299: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 27885 1726882537.72318: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" 
revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882537.72388: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882537.72463: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpz5wyqfdi /root/.ansible/tmp/ansible-tmp-1726882537.6657043-28363-172415241188346/AnsiballZ_command.py <<< 27885 1726882537.72467: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882537.6657043-28363-172415241188346/AnsiballZ_command.py" <<< 27885 1726882537.72600: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpz5wyqfdi" to remote "/root/.ansible/tmp/ansible-tmp-1726882537.6657043-28363-172415241188346/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882537.6657043-28363-172415241188346/AnsiballZ_command.py" <<< 27885 1726882537.73637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882537.73697: stderr chunk (state=3): >>><<< 27885 1726882537.73705: stdout chunk (state=3): >>><<< 27885 1726882537.73767: done transferring module to remote 27885 1726882537.73774: _low_level_execute_command(): starting 27885 1726882537.73780: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882537.6657043-28363-172415241188346/ /root/.ansible/tmp/ansible-tmp-1726882537.6657043-28363-172415241188346/AnsiballZ_command.py && sleep 0' 27885 1726882537.74379: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882537.74412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882537.74427: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27885 1726882537.74508: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882537.74520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882537.74533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882537.74558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882537.74640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882537.76363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882537.76411: stderr chunk (state=3): >>><<< 27885 1726882537.76414: stdout chunk (state=3): >>><<< 27885 1726882537.76433: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882537.76437: _low_level_execute_command(): starting 27885 1726882537.76441: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882537.6657043-28363-172415241188346/AnsiballZ_command.py && sleep 0' 27885 1726882537.77020: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882537.77028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882537.77039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882537.77060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882537.77072: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882537.77156: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882537.77161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882537.77178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882537.77197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882537.77286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882537.92433: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest1", "up"], "start": "2024-09-20 21:35:37.919780", "end": "2024-09-20 21:35:37.923378", "delta": "0:00:00.003598", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, 
"strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 27885 1726882537.92467: stdout chunk (state=3): >>> <<< 27885 1726882537.93998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882537.94002: stdout chunk (state=3): >>><<< 27885 1726882537.94005: stderr chunk (state=3): >>><<< 27885 1726882537.94009: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest1", "up"], "start": "2024-09-20 21:35:37.919780", "end": "2024-09-20 21:35:37.923378", "delta": "0:00:00.003598", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
27885 1726882537.94011: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest1 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882537.6657043-28363-172415241188346/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882537.94014: _low_level_execute_command(): starting 27885 1726882537.94016: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882537.6657043-28363-172415241188346/ > /dev/null 2>&1 && sleep 0' 27885 1726882537.94430: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882537.94433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882537.94467: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882537.94472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27885 1726882537.94475: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882537.94477: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882537.94531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882537.94534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882537.94600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882537.96372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882537.96397: stderr chunk (state=3): >>><<< 27885 1726882537.96406: stdout chunk (state=3): >>><<< 27885 1726882537.96422: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882537.96425: handler run complete 27885 1726882537.96440: Evaluated conditional (False): False 27885 1726882537.96447: attempt loop complete, returning result 27885 1726882537.96462: variable 'item' from source: unknown 27885 1726882537.96525: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set ethtest1 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest1", "up" ], "delta": "0:00:00.003598", "end": "2024-09-20 21:35:37.923378", "item": "ip link set ethtest1 up", "rc": 0, "start": "2024-09-20 21:35:37.919780" } 27885 1726882537.96639: dumping result to json 27885 1726882537.96642: done dumping result, returning 27885 1726882537.96643: done running TaskExecutor() for managed_node2/TASK: Create veth interface ethtest1 [12673a56-9f93-3fa5-01be-000000000300] 27885 1726882537.96645: sending task result for task 12673a56-9f93-3fa5-01be-000000000300 27885 1726882537.96688: done sending task result for task 12673a56-9f93-3fa5-01be-000000000300 27885 1726882537.96699: WORKER PROCESS EXITING 27885 1726882537.96755: no more pending results, returning what we have 27885 1726882537.96758: results queue empty 27885 1726882537.96759: checking for any_errors_fatal 27885 1726882537.96765: done checking for any_errors_fatal 27885 1726882537.96765: checking for max_fail_percentage 27885 1726882537.96767: done checking for max_fail_percentage 27885 1726882537.96768: checking to see if all hosts have failed and the running result is not ok 27885 1726882537.96769: done checking to see if all hosts have failed 27885 1726882537.96769: getting the remaining hosts for this loop 27885 1726882537.96771: done getting the remaining hosts for this loop 27885 1726882537.96774: getting the next task for host managed_node2 27885 1726882537.96784: done getting next task for host managed_node2 27885 1726882537.96786: ^ task is: TASK: Set up veth as managed by NetworkManager 27885 1726882537.96792: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882537.96798: getting variables 27885 1726882537.96799: in VariableManager get_vars() 27885 1726882537.96843: Calling all_inventory to load vars for managed_node2 27885 1726882537.96846: Calling groups_inventory to load vars for managed_node2 27885 1726882537.96848: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882537.96856: Calling all_plugins_play to load vars for managed_node2 27885 1726882537.96859: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882537.96861: Calling groups_plugins_play to load vars for managed_node2 27885 1726882537.97036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882537.97169: done with get_vars() 27885 1726882537.97181: done getting variables 27885 1726882537.97243: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:35:37 -0400 (0:00:01.072) 0:00:10.615 ****** 27885 1726882537.97272: entering _queue_task() for managed_node2/command 27885 1726882537.97539: worker is 1 (out of 1 available) 27885 1726882537.97552: exiting _queue_task() for managed_node2/command 27885 1726882537.97566: done queuing things up, now waiting for results queue to drain 27885 1726882537.97568: waiting for pending results... 
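
The "Create veth interface ethtest1" result above is one item of a looped command task: the rendered result carries ansible_loop_var=item with item='ip link set ethtest1 up', and the logged invocation has _uses_shell=false, so it maps to the command module rather than shell. A minimal sketch of a task shape that would produce such an invocation, reconstructed from the logged module_args only (the actual wording in manage_test_interface.yml and the other loop items are not shown in this part of the log):

- name: Create veth interface ethtest1
  # Hypothetical reconstruction from the logged module_args; the real task file may differ.
  ansible.builtin.command: "{{ item }}"
  loop:
    - ip link set ethtest1 up   # only this loop item appears in this part of the log
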
27885 1726882537.97848: running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager 27885 1726882537.97960: in run() - task 12673a56-9f93-3fa5-01be-000000000301 27885 1726882537.98003: variable 'ansible_search_path' from source: unknown 27885 1726882537.98007: variable 'ansible_search_path' from source: unknown 27885 1726882537.98099: calling self._execute() 27885 1726882537.98142: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882537.98156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882537.98171: variable 'omit' from source: magic vars 27885 1726882537.98576: variable 'ansible_distribution_major_version' from source: facts 27885 1726882537.98602: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882537.98759: variable 'type' from source: set_fact 27885 1726882537.98769: variable 'state' from source: include params 27885 1726882537.98785: Evaluated conditional (type == 'veth' and state == 'present'): True 27885 1726882537.98801: variable 'omit' from source: magic vars 27885 1726882537.98998: variable 'omit' from source: magic vars 27885 1726882537.99001: variable 'interface' from source: set_fact 27885 1726882537.99003: variable 'omit' from source: magic vars 27885 1726882537.99005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882537.99046: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882537.99072: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882537.99099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882537.99122: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882537.99157: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882537.99167: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882537.99175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882537.99294: Set connection var ansible_pipelining to False 27885 1726882537.99309: Set connection var ansible_connection to ssh 27885 1726882537.99321: Set connection var ansible_timeout to 10 27885 1726882537.99333: Set connection var ansible_shell_type to sh 27885 1726882537.99349: Set connection var ansible_shell_executable to /bin/sh 27885 1726882537.99360: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882537.99392: variable 'ansible_shell_executable' from source: unknown 27885 1726882537.99404: variable 'ansible_connection' from source: unknown 27885 1726882537.99413: variable 'ansible_module_compression' from source: unknown 27885 1726882537.99420: variable 'ansible_shell_type' from source: unknown 27885 1726882537.99428: variable 'ansible_shell_executable' from source: unknown 27885 1726882537.99449: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882537.99452: variable 'ansible_pipelining' from source: unknown 27885 1726882537.99454: variable 'ansible_timeout' from source: unknown 27885 1726882537.99558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882537.99622: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882537.99639: variable 'omit' from source: magic vars 27885 1726882537.99651: starting attempt loop 27885 1726882537.99661: running the handler 27885 1726882537.99686: _low_level_execute_command(): starting 27885 1726882537.99777: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882538.00519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882538.00565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882538.00602: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882538.00695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882538.02246: stdout chunk (state=3): >>>/root <<< 27885 1726882538.02338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882538.02372: stderr chunk (state=3): >>><<< 27885 1726882538.02376: stdout chunk (state=3): >>><<< 27885 1726882538.02394: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882538.02410: _low_level_execute_command(): starting 27885 1726882538.02416: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882538.0239732-28440-175069534365388 `" && echo ansible-tmp-1726882538.0239732-28440-175069534365388="` echo /root/.ansible/tmp/ansible-tmp-1726882538.0239732-28440-175069534365388 `" ) && sleep 0' 27885 1726882538.02845: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882538.02849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882538.02859: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882538.02862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882538.02864: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882538.02912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882538.02919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882538.02979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882538.04818: stdout chunk (state=3): >>>ansible-tmp-1726882538.0239732-28440-175069534365388=/root/.ansible/tmp/ansible-tmp-1726882538.0239732-28440-175069534365388 <<< 27885 1726882538.04927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882538.04949: stderr chunk (state=3): >>><<< 27885 1726882538.04952: stdout chunk (state=3): >>><<< 27885 1726882538.04966: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882538.0239732-28440-175069534365388=/root/.ansible/tmp/ansible-tmp-1726882538.0239732-28440-175069534365388 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882538.05000: variable 'ansible_module_compression' from source: unknown 27885 1726882538.05040: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882538.05066: variable 'ansible_facts' from source: unknown 27885 1726882538.05127: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882538.0239732-28440-175069534365388/AnsiballZ_command.py 27885 1726882538.05226: Sending initial data 27885 1726882538.05230: Sent initial data (156 bytes) 27885 1726882538.05656: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882538.05659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882538.05661: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882538.05663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882538.05667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882538.05715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882538.05719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882538.05792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882538.07340: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882538.07411: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27885 1726882538.07485: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpgy1kttqm /root/.ansible/tmp/ansible-tmp-1726882538.0239732-28440-175069534365388/AnsiballZ_command.py <<< 27885 1726882538.07505: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882538.0239732-28440-175069534365388/AnsiballZ_command.py" <<< 27885 1726882538.07565: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpgy1kttqm" to remote "/root/.ansible/tmp/ansible-tmp-1726882538.0239732-28440-175069534365388/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882538.0239732-28440-175069534365388/AnsiballZ_command.py" <<< 27885 1726882538.08597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882538.08600: stderr chunk (state=3): >>><<< 27885 1726882538.08603: stdout chunk (state=3): >>><<< 27885 1726882538.08605: done transferring module to remote 27885 1726882538.08607: _low_level_execute_command(): starting 27885 1726882538.08608: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882538.0239732-28440-175069534365388/ /root/.ansible/tmp/ansible-tmp-1726882538.0239732-28440-175069534365388/AnsiballZ_command.py && sleep 0' 27885 1726882538.09190: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882538.09220: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882538.09234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882538.09330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882538.11065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882538.11083: stdout chunk (state=3): >>><<< 27885 1726882538.11097: stderr chunk (state=3): >>><<< 27885 1726882538.11117: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882538.11126: _low_level_execute_command(): starting 27885 1726882538.11201: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882538.0239732-28440-175069534365388/AnsiballZ_command.py && sleep 0' 27885 1726882538.11727: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882538.11746: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882538.11762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882538.11780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882538.11862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882538.11898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882538.11915: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882538.11934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882538.12021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882538.28971: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest1", "managed", "true"], "start": "2024-09-20 21:35:38.268061", "end": "2024-09-20 21:35:38.288775", "delta": "0:00:00.020714", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest1 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27885 1726882538.30776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882538.30780: stdout chunk (state=3): >>><<< 27885 1726882538.30782: stderr chunk (state=3): >>><<< 27885 1726882538.30784: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest1", "managed", "true"], "start": "2024-09-20 21:35:38.268061", "end": "2024-09-20 21:35:38.288775", "delta": "0:00:00.020714", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest1 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
27885 1726882538.30788: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest1 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882538.0239732-28440-175069534365388/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882538.30794: _low_level_execute_command(): starting 27885 1726882538.30797: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882538.0239732-28440-175069534365388/ > /dev/null 2>&1 && sleep 0' 27885 1726882538.31730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882538.31734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882538.31736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882538.31743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882538.31794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882538.31798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882538.32365: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882538.32386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882538.34335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882538.34428: stderr chunk (state=3): >>><<< 27885 1726882538.34432: stdout chunk (state=3): >>><<< 27885 1726882538.34434: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882538.34437: handler run complete 27885 1726882538.34443: Evaluated conditional (False): False 27885 1726882538.34459: attempt loop complete, returning result 27885 1726882538.34466: _execute() done 27885 1726882538.34645: dumping result to json 27885 1726882538.34648: done dumping result, returning 27885 1726882538.34650: done running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager [12673a56-9f93-3fa5-01be-000000000301] 27885 1726882538.34653: sending task result for task 12673a56-9f93-3fa5-01be-000000000301 27885 1726882538.34732: done sending task result for task 12673a56-9f93-3fa5-01be-000000000301 27885 1726882538.34735: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest1", "managed", "true" ], "delta": "0:00:00.020714", "end": "2024-09-20 21:35:38.288775", "rc": 0, "start": "2024-09-20 21:35:38.268061" } 27885 1726882538.34820: no more pending results, returning what we have 27885 1726882538.34824: results queue empty 27885 1726882538.34825: checking for any_errors_fatal 27885 1726882538.34842: done checking for any_errors_fatal 27885 1726882538.34843: checking for max_fail_percentage 27885 1726882538.34844: done checking for max_fail_percentage 27885 1726882538.34845: checking to see if all hosts have failed and the running result is not ok 27885 1726882538.34846: done checking to see if all hosts have failed 27885 1726882538.34847: getting the remaining hosts for this loop 27885 1726882538.34849: done getting the remaining hosts for this loop 27885 1726882538.34852: getting the next task for host managed_node2 27885 1726882538.34858: done getting next task for host managed_node2 27885 1726882538.34861: ^ task is: TASK: Delete veth interface {{ interface }} 27885 1726882538.34865: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882538.34870: getting variables 27885 1726882538.34872: in VariableManager get_vars() 27885 1726882538.35033: Calling all_inventory to load vars for managed_node2 27885 1726882538.35036: Calling groups_inventory to load vars for managed_node2 27885 1726882538.35038: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882538.35049: Calling all_plugins_play to load vars for managed_node2 27885 1726882538.35052: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882538.35055: Calling groups_plugins_play to load vars for managed_node2 27885 1726882538.35755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882538.36337: done with get_vars() 27885 1726882538.36349: done getting variables 27885 1726882538.36529: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882538.36759: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest1] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:35:38 -0400 (0:00:00.395) 0:00:11.010 ****** 27885 1726882538.36795: entering _queue_task() for managed_node2/command 27885 1726882538.37478: worker is 1 (out of 1 available) 27885 1726882538.37496: exiting _queue_task() for managed_node2/command 27885 1726882538.37509: done queuing things up, now waiting for results queue to drain 27885 1726882538.37510: waiting for pending results... 
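
The "Set up veth as managed by NetworkManager" task above ran because its logged guard, type == 'veth' and state == 'present', evaluated to True, and its whole payload is the single nmcli call shown in the result: nmcli d set ethtest1 managed true, with the device name coming from the interface fact. A minimal sketch of such a task (the exact text at manage_test_interface.yml:35 may differ):

- name: Set up veth as managed by NetworkManager
  # Sketch reconstructed from the logged command and conditional; not copied from the task file.
  ansible.builtin.command: nmcli d set {{ interface }} managed true
  when: type == 'veth' and state == 'present'

Note that ansible_pipelining is False for this connection (see the connection-variable dump earlier), which is why every one of these command tasks repeats the full remote cycle visible in the log: create a ~/.ansible/tmp directory, sftp-put AnsiballZ_command.py, chmod u+x, run it with /usr/bin/python3.12, then remove the directory. With pipelining enabled, the module payload would instead be streamed to the remote interpreter and the temporary-file round trips would disappear.
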
27885 1726882538.37913: running TaskExecutor() for managed_node2/TASK: Delete veth interface ethtest1 27885 1726882538.37953: in run() - task 12673a56-9f93-3fa5-01be-000000000302 27885 1726882538.37967: variable 'ansible_search_path' from source: unknown 27885 1726882538.37971: variable 'ansible_search_path' from source: unknown 27885 1726882538.38210: calling self._execute() 27885 1726882538.38283: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882538.38288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882538.38317: variable 'omit' from source: magic vars 27885 1726882538.39198: variable 'ansible_distribution_major_version' from source: facts 27885 1726882538.39201: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882538.39598: variable 'type' from source: set_fact 27885 1726882538.39601: variable 'state' from source: include params 27885 1726882538.39604: variable 'interface' from source: set_fact 27885 1726882538.39607: variable 'current_interfaces' from source: set_fact 27885 1726882538.39610: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 27885 1726882538.39612: when evaluation is False, skipping this task 27885 1726882538.39614: _execute() done 27885 1726882538.39617: dumping result to json 27885 1726882538.39619: done dumping result, returning 27885 1726882538.39622: done running TaskExecutor() for managed_node2/TASK: Delete veth interface ethtest1 [12673a56-9f93-3fa5-01be-000000000302] 27885 1726882538.39625: sending task result for task 12673a56-9f93-3fa5-01be-000000000302 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 27885 1726882538.39744: no more pending results, returning what we have 27885 1726882538.39749: results queue empty 27885 1726882538.39751: checking for any_errors_fatal 27885 1726882538.39759: done checking for any_errors_fatal 27885 1726882538.39760: checking for max_fail_percentage 27885 1726882538.39762: done checking for max_fail_percentage 27885 1726882538.39763: checking to see if all hosts have failed and the running result is not ok 27885 1726882538.39764: done checking to see if all hosts have failed 27885 1726882538.39764: getting the remaining hosts for this loop 27885 1726882538.39766: done getting the remaining hosts for this loop 27885 1726882538.39770: getting the next task for host managed_node2 27885 1726882538.39777: done getting next task for host managed_node2 27885 1726882538.39780: ^ task is: TASK: Create dummy interface {{ interface }} 27885 1726882538.39785: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882538.39794: getting variables 27885 1726882538.39796: in VariableManager get_vars() 27885 1726882538.39841: Calling all_inventory to load vars for managed_node2 27885 1726882538.39845: Calling groups_inventory to load vars for managed_node2 27885 1726882538.39848: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882538.39861: Calling all_plugins_play to load vars for managed_node2 27885 1726882538.39865: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882538.39868: Calling groups_plugins_play to load vars for managed_node2 27885 1726882538.40459: done sending task result for task 12673a56-9f93-3fa5-01be-000000000302 27885 1726882538.40463: WORKER PROCESS EXITING 27885 1726882538.40514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882538.40925: done with get_vars() 27885 1726882538.40935: done getting variables 27885 1726882538.40992: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882538.41303: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest1] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:35:38 -0400 (0:00:00.045) 0:00:11.055 ****** 27885 1726882538.41333: entering _queue_task() for managed_node2/command 27885 1726882538.41792: worker is 1 (out of 1 available) 27885 1726882538.41805: exiting _queue_task() for managed_node2/command 27885 1726882538.41817: done queuing things up, now waiting for results queue to drain 27885 1726882538.41818: waiting for pending results... 
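
The sources the log names for these variables are set_fact for interface and type, include params for state, and set_fact for current_interfaces. Purely as an illustration of how the first three could be wired together (the play that actually includes manage_test_interface.yml is not part of this excerpt, so names and layout here are hypothetical):

# Hypothetical reconstruction of the variable sources named in the log.
- ansible.builtin.set_fact:
    interface: ethtest1
    type: veth

- ansible.builtin.include_tasks:
    file: tasks/manage_test_interface.yml
  vars:
    state: present   # the log attributes 'state' to include params
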
27885 1726882538.42264: running TaskExecutor() for managed_node2/TASK: Create dummy interface ethtest1 27885 1726882538.42601: in run() - task 12673a56-9f93-3fa5-01be-000000000303 27885 1726882538.42605: variable 'ansible_search_path' from source: unknown 27885 1726882538.42608: variable 'ansible_search_path' from source: unknown 27885 1726882538.42625: calling self._execute() 27885 1726882538.42827: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882538.42840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882538.42854: variable 'omit' from source: magic vars 27885 1726882538.43375: variable 'ansible_distribution_major_version' from source: facts 27885 1726882538.43485: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882538.43904: variable 'type' from source: set_fact 27885 1726882538.43998: variable 'state' from source: include params 27885 1726882538.44001: variable 'interface' from source: set_fact 27885 1726882538.44003: variable 'current_interfaces' from source: set_fact 27885 1726882538.44006: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 27885 1726882538.44008: when evaluation is False, skipping this task 27885 1726882538.44010: _execute() done 27885 1726882538.44012: dumping result to json 27885 1726882538.44017: done dumping result, returning 27885 1726882538.44020: done running TaskExecutor() for managed_node2/TASK: Create dummy interface ethtest1 [12673a56-9f93-3fa5-01be-000000000303] 27885 1726882538.44022: sending task result for task 12673a56-9f93-3fa5-01be-000000000303 27885 1726882538.44084: done sending task result for task 12673a56-9f93-3fa5-01be-000000000303 27885 1726882538.44088: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 27885 1726882538.44138: no more pending results, returning what we have 27885 1726882538.44142: results queue empty 27885 1726882538.44144: checking for any_errors_fatal 27885 1726882538.44151: done checking for any_errors_fatal 27885 1726882538.44152: checking for max_fail_percentage 27885 1726882538.44153: done checking for max_fail_percentage 27885 1726882538.44154: checking to see if all hosts have failed and the running result is not ok 27885 1726882538.44155: done checking to see if all hosts have failed 27885 1726882538.44156: getting the remaining hosts for this loop 27885 1726882538.44158: done getting the remaining hosts for this loop 27885 1726882538.44162: getting the next task for host managed_node2 27885 1726882538.44169: done getting next task for host managed_node2 27885 1726882538.44172: ^ task is: TASK: Delete dummy interface {{ interface }} 27885 1726882538.44176: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882538.44180: getting variables 27885 1726882538.44182: in VariableManager get_vars() 27885 1726882538.44229: Calling all_inventory to load vars for managed_node2 27885 1726882538.44232: Calling groups_inventory to load vars for managed_node2 27885 1726882538.44234: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882538.44247: Calling all_plugins_play to load vars for managed_node2 27885 1726882538.44250: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882538.44253: Calling groups_plugins_play to load vars for managed_node2 27885 1726882538.44849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882538.45191: done with get_vars() 27885 1726882538.45203: done getting variables 27885 1726882538.45263: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882538.45362: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest1] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:35:38 -0400 (0:00:00.040) 0:00:11.096 ****** 27885 1726882538.45387: entering _queue_task() for managed_node2/command 27885 1726882538.45618: worker is 1 (out of 1 available) 27885 1726882538.45629: exiting _queue_task() for managed_node2/command 27885 1726882538.45641: done queuing things up, now waiting for results queue to drain 27885 1726882538.45642: waiting for pending results... 
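
The "Delete veth interface ethtest1" task was skipped above because its guard, type == 'veth' and state == 'absent' and interface in current_interfaces, is false while this run is creating the interface. A sketch of the guarded task shape; the when-condition is copied from the logged false_condition, but the command is a hypothetical placeholder since a skipped task's command never appears in the log:

- name: Delete veth interface {{ interface }}
  # when-guard copied from the logged false_condition; the command is a placeholder, not from the task file.
  ansible.builtin.command: ip link del {{ interface }}
  when: type == 'veth' and state == 'absent' and interface in current_interfaces
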
27885 1726882538.45997: running TaskExecutor() for managed_node2/TASK: Delete dummy interface ethtest1 27885 1726882538.46003: in run() - task 12673a56-9f93-3fa5-01be-000000000304 27885 1726882538.46006: variable 'ansible_search_path' from source: unknown 27885 1726882538.46015: variable 'ansible_search_path' from source: unknown 27885 1726882538.46053: calling self._execute() 27885 1726882538.46140: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882538.46153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882538.46166: variable 'omit' from source: magic vars 27885 1726882538.46505: variable 'ansible_distribution_major_version' from source: facts 27885 1726882538.46525: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882538.46745: variable 'type' from source: set_fact 27885 1726882538.46755: variable 'state' from source: include params 27885 1726882538.46849: variable 'interface' from source: set_fact 27885 1726882538.46852: variable 'current_interfaces' from source: set_fact 27885 1726882538.46855: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 27885 1726882538.46857: when evaluation is False, skipping this task 27885 1726882538.46859: _execute() done 27885 1726882538.46861: dumping result to json 27885 1726882538.46863: done dumping result, returning 27885 1726882538.46865: done running TaskExecutor() for managed_node2/TASK: Delete dummy interface ethtest1 [12673a56-9f93-3fa5-01be-000000000304] 27885 1726882538.46867: sending task result for task 12673a56-9f93-3fa5-01be-000000000304 27885 1726882538.46932: done sending task result for task 12673a56-9f93-3fa5-01be-000000000304 27885 1726882538.46935: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 27885 1726882538.47004: no more pending results, returning what we have 27885 1726882538.47008: results queue empty 27885 1726882538.47010: checking for any_errors_fatal 27885 1726882538.47016: done checking for any_errors_fatal 27885 1726882538.47017: checking for max_fail_percentage 27885 1726882538.47018: done checking for max_fail_percentage 27885 1726882538.47019: checking to see if all hosts have failed and the running result is not ok 27885 1726882538.47020: done checking to see if all hosts have failed 27885 1726882538.47021: getting the remaining hosts for this loop 27885 1726882538.47023: done getting the remaining hosts for this loop 27885 1726882538.47026: getting the next task for host managed_node2 27885 1726882538.47033: done getting next task for host managed_node2 27885 1726882538.47036: ^ task is: TASK: Create tap interface {{ interface }} 27885 1726882538.47040: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882538.47046: getting variables 27885 1726882538.47047: in VariableManager get_vars() 27885 1726882538.47087: Calling all_inventory to load vars for managed_node2 27885 1726882538.47090: Calling groups_inventory to load vars for managed_node2 27885 1726882538.47094: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882538.47106: Calling all_plugins_play to load vars for managed_node2 27885 1726882538.47109: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882538.47112: Calling groups_plugins_play to load vars for managed_node2 27885 1726882538.47558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882538.47770: done with get_vars() 27885 1726882538.47779: done getting variables 27885 1726882538.47834: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882538.47940: variable 'interface' from source: set_fact TASK [Create tap interface ethtest1] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:35:38 -0400 (0:00:00.025) 0:00:11.122 ****** 27885 1726882538.47967: entering _queue_task() for managed_node2/command 27885 1726882538.48212: worker is 1 (out of 1 available) 27885 1726882538.48224: exiting _queue_task() for managed_node2/command 27885 1726882538.48236: done queuing things up, now waiting for results queue to drain 27885 1726882538.48238: waiting for pending results... 
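
Several of these guards test membership of interface in current_interfaces, which the log attributes to an earlier set_fact. How this test actually populates that fact is not shown in this excerpt; purely as an illustration, a fact like it could be gathered along these lines (hypothetical helper names):

# Hypothetical illustration only; not how the fedora.linux_system_roles test gathers it.
- ansible.builtin.command: ls /sys/class/net
  register: net_ifaces   # hypothetical register name
- ansible.builtin.set_fact:
    current_interfaces: "{{ net_ifaces.stdout_lines }}"
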
27885 1726882538.48499: running TaskExecutor() for managed_node2/TASK: Create tap interface ethtest1 27885 1726882538.48674: in run() - task 12673a56-9f93-3fa5-01be-000000000305 27885 1726882538.48696: variable 'ansible_search_path' from source: unknown 27885 1726882538.48799: variable 'ansible_search_path' from source: unknown 27885 1726882538.48802: calling self._execute() 27885 1726882538.48823: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882538.48832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882538.48841: variable 'omit' from source: magic vars 27885 1726882538.49352: variable 'ansible_distribution_major_version' from source: facts 27885 1726882538.49371: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882538.49909: variable 'type' from source: set_fact 27885 1726882538.49912: variable 'state' from source: include params 27885 1726882538.49915: variable 'interface' from source: set_fact 27885 1726882538.49917: variable 'current_interfaces' from source: set_fact 27885 1726882538.49919: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 27885 1726882538.49921: when evaluation is False, skipping this task 27885 1726882538.49922: _execute() done 27885 1726882538.49924: dumping result to json 27885 1726882538.49926: done dumping result, returning 27885 1726882538.49928: done running TaskExecutor() for managed_node2/TASK: Create tap interface ethtest1 [12673a56-9f93-3fa5-01be-000000000305] 27885 1726882538.49930: sending task result for task 12673a56-9f93-3fa5-01be-000000000305 27885 1726882538.49991: done sending task result for task 12673a56-9f93-3fa5-01be-000000000305 27885 1726882538.49997: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 27885 1726882538.50063: no more pending results, returning what we have 27885 1726882538.50067: results queue empty 27885 1726882538.50068: checking for any_errors_fatal 27885 1726882538.50078: done checking for any_errors_fatal 27885 1726882538.50079: checking for max_fail_percentage 27885 1726882538.50080: done checking for max_fail_percentage 27885 1726882538.50081: checking to see if all hosts have failed and the running result is not ok 27885 1726882538.50082: done checking to see if all hosts have failed 27885 1726882538.50083: getting the remaining hosts for this loop 27885 1726882538.50084: done getting the remaining hosts for this loop 27885 1726882538.50088: getting the next task for host managed_node2 27885 1726882538.50097: done getting next task for host managed_node2 27885 1726882538.50101: ^ task is: TASK: Delete tap interface {{ interface }} 27885 1726882538.50105: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882538.50110: getting variables 27885 1726882538.50112: in VariableManager get_vars() 27885 1726882538.50153: Calling all_inventory to load vars for managed_node2 27885 1726882538.50155: Calling groups_inventory to load vars for managed_node2 27885 1726882538.50158: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882538.50170: Calling all_plugins_play to load vars for managed_node2 27885 1726882538.50172: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882538.50174: Calling groups_plugins_play to load vars for managed_node2 27885 1726882538.50688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882538.51060: done with get_vars() 27885 1726882538.51071: done getting variables 27885 1726882538.51132: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882538.51246: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest1] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:35:38 -0400 (0:00:00.033) 0:00:11.155 ****** 27885 1726882538.51275: entering _queue_task() for managed_node2/command 27885 1726882538.51528: worker is 1 (out of 1 available) 27885 1726882538.51541: exiting _queue_task() for managed_node2/command 27885 1726882538.51554: done queuing things up, now waiting for results queue to drain 27885 1726882538.51556: waiting for pending results... 
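The variable-source lines in the trace above ('state' from include params, 'type' and 'interface' from set_fact) show how manage_test_interface.yml is parameterized: the interface name and type are established as facts earlier in the play, while the desired state is passed in on the include. A minimal sketch of that calling convention follows; the concrete values are placeholders, since this part of the log only records that each dummy/tap condition evaluated to False, not the actual type or state.

    # Assumed calling pattern, inferred from the variable-source lines in this log.
    # Values marked as placeholders are not confirmed by this section of the log.
    - name: Describe the interface under test
      ansible.builtin.set_fact:
        interface: ethtest1        # matches the task banners in this run
        type: veth                 # placeholder; the real value is not printed here

    - name: Manage the test interface
      ansible.builtin.include_tasks: tasks/manage_test_interface.yml
      vars:
        state: present             # placeholder; 'state' arrives via include params per the log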
27885 1726882538.52009: running TaskExecutor() for managed_node2/TASK: Delete tap interface ethtest1 27885 1726882538.52014: in run() - task 12673a56-9f93-3fa5-01be-000000000306 27885 1726882538.52019: variable 'ansible_search_path' from source: unknown 27885 1726882538.52021: variable 'ansible_search_path' from source: unknown 27885 1726882538.52024: calling self._execute() 27885 1726882538.52067: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882538.52080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882538.52095: variable 'omit' from source: magic vars 27885 1726882538.52442: variable 'ansible_distribution_major_version' from source: facts 27885 1726882538.52463: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882538.52821: variable 'type' from source: set_fact 27885 1726882538.52833: variable 'state' from source: include params 27885 1726882538.52845: variable 'interface' from source: set_fact 27885 1726882538.52854: variable 'current_interfaces' from source: set_fact 27885 1726882538.52866: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 27885 1726882538.52874: when evaluation is False, skipping this task 27885 1726882538.52881: _execute() done 27885 1726882538.52889: dumping result to json 27885 1726882538.52901: done dumping result, returning 27885 1726882538.52912: done running TaskExecutor() for managed_node2/TASK: Delete tap interface ethtest1 [12673a56-9f93-3fa5-01be-000000000306] 27885 1726882538.53117: sending task result for task 12673a56-9f93-3fa5-01be-000000000306 27885 1726882538.53182: done sending task result for task 12673a56-9f93-3fa5-01be-000000000306 27885 1726882538.53185: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 27885 1726882538.53266: no more pending results, returning what we have 27885 1726882538.53271: results queue empty 27885 1726882538.53272: checking for any_errors_fatal 27885 1726882538.53278: done checking for any_errors_fatal 27885 1726882538.53279: checking for max_fail_percentage 27885 1726882538.53281: done checking for max_fail_percentage 27885 1726882538.53282: checking to see if all hosts have failed and the running result is not ok 27885 1726882538.53282: done checking to see if all hosts have failed 27885 1726882538.53283: getting the remaining hosts for this loop 27885 1726882538.53285: done getting the remaining hosts for this loop 27885 1726882538.53289: getting the next task for host managed_node2 27885 1726882538.53303: done getting next task for host managed_node2 27885 1726882538.53307: ^ task is: TASK: Assert device is present 27885 1726882538.53312: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882538.53316: getting variables 27885 1726882538.53318: in VariableManager get_vars() 27885 1726882538.53362: Calling all_inventory to load vars for managed_node2 27885 1726882538.53366: Calling groups_inventory to load vars for managed_node2 27885 1726882538.53369: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882538.53382: Calling all_plugins_play to load vars for managed_node2 27885 1726882538.53385: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882538.53388: Calling groups_plugins_play to load vars for managed_node2 27885 1726882538.54118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882538.54442: done with get_vars() 27885 1726882538.54452: done getting variables TASK [Assert device is present] ************************************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:32 Friday 20 September 2024 21:35:38 -0400 (0:00:00.032) 0:00:11.187 ****** 27885 1726882538.54540: entering _queue_task() for managed_node2/include_tasks 27885 1726882538.54904: worker is 1 (out of 1 available) 27885 1726882538.54917: exiting _queue_task() for managed_node2/include_tasks 27885 1726882538.54928: done queuing things up, now waiting for results queue to drain 27885 1726882538.54930: waiting for pending results... 27885 1726882538.55211: running TaskExecutor() for managed_node2/TASK: Assert device is present 27885 1726882538.55216: in run() - task 12673a56-9f93-3fa5-01be-000000000012 27885 1726882538.55229: variable 'ansible_search_path' from source: unknown 27885 1726882538.55268: calling self._execute() 27885 1726882538.55358: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882538.55370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882538.55383: variable 'omit' from source: magic vars 27885 1726882538.55739: variable 'ansible_distribution_major_version' from source: facts 27885 1726882538.55759: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882538.55771: _execute() done 27885 1726882538.55779: dumping result to json 27885 1726882538.55788: done dumping result, returning 27885 1726882538.55800: done running TaskExecutor() for managed_node2/TASK: Assert device is present [12673a56-9f93-3fa5-01be-000000000012] 27885 1726882538.55811: sending task result for task 12673a56-9f93-3fa5-01be-000000000012 27885 1726882538.56100: done sending task result for task 12673a56-9f93-3fa5-01be-000000000012 27885 1726882538.56104: WORKER PROCESS EXITING 27885 1726882538.56126: no more pending results, returning what we have 27885 1726882538.56131: in VariableManager get_vars() 27885 1726882538.56170: Calling all_inventory to load vars for managed_node2 27885 1726882538.56173: Calling groups_inventory to load vars for managed_node2 27885 1726882538.56175: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882538.56184: Calling all_plugins_play to load vars for managed_node2 27885 1726882538.56186: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882538.56189: Calling groups_plugins_play to load vars for managed_node2 27885 1726882538.56397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882538.56596: done with get_vars() 27885 1726882538.56605: variable 
'ansible_search_path' from source: unknown 27885 1726882538.56617: we have included files to process 27885 1726882538.56618: generating all_blocks data 27885 1726882538.56620: done generating all_blocks data 27885 1726882538.56626: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 27885 1726882538.56627: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 27885 1726882538.56629: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 27885 1726882538.56730: in VariableManager get_vars() 27885 1726882538.56752: done with get_vars() 27885 1726882538.56859: done processing included file 27885 1726882538.56861: iterating over new_blocks loaded from include file 27885 1726882538.56863: in VariableManager get_vars() 27885 1726882538.57118: done with get_vars() 27885 1726882538.57120: filtering new block on tags 27885 1726882538.57137: done filtering new block on tags 27885 1726882538.57140: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 27885 1726882538.57144: extending task lists for all hosts with included blocks 27885 1726882538.58166: done extending task lists 27885 1726882538.58167: done processing included files 27885 1726882538.58168: results queue empty 27885 1726882538.58169: checking for any_errors_fatal 27885 1726882538.58171: done checking for any_errors_fatal 27885 1726882538.58172: checking for max_fail_percentage 27885 1726882538.58173: done checking for max_fail_percentage 27885 1726882538.58174: checking to see if all hosts have failed and the running result is not ok 27885 1726882538.58174: done checking to see if all hosts have failed 27885 1726882538.58175: getting the remaining hosts for this loop 27885 1726882538.58176: done getting the remaining hosts for this loop 27885 1726882538.58178: getting the next task for host managed_node2 27885 1726882538.58182: done getting next task for host managed_node2 27885 1726882538.58184: ^ task is: TASK: Include the task 'get_interface_stat.yml' 27885 1726882538.58186: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882538.58188: getting variables 27885 1726882538.58189: in VariableManager get_vars() 27885 1726882538.58203: Calling all_inventory to load vars for managed_node2 27885 1726882538.58206: Calling groups_inventory to load vars for managed_node2 27885 1726882538.58207: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882538.58212: Calling all_plugins_play to load vars for managed_node2 27885 1726882538.58215: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882538.58217: Calling groups_plugins_play to load vars for managed_node2 27885 1726882538.58376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882538.58577: done with get_vars() 27885 1726882538.58586: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:35:38 -0400 (0:00:00.041) 0:00:11.229 ****** 27885 1726882538.58655: entering _queue_task() for managed_node2/include_tasks 27885 1726882538.59124: worker is 1 (out of 1 available) 27885 1726882538.59131: exiting _queue_task() for managed_node2/include_tasks 27885 1726882538.59142: done queuing things up, now waiting for results queue to drain 27885 1726882538.59143: waiting for pending results... 27885 1726882538.59183: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 27885 1726882538.59279: in run() - task 12673a56-9f93-3fa5-01be-0000000003eb 27885 1726882538.59300: variable 'ansible_search_path' from source: unknown 27885 1726882538.59308: variable 'ansible_search_path' from source: unknown 27885 1726882538.59346: calling self._execute() 27885 1726882538.59434: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882538.59445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882538.59458: variable 'omit' from source: magic vars 27885 1726882538.59821: variable 'ansible_distribution_major_version' from source: facts 27885 1726882538.59837: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882538.59847: _execute() done 27885 1726882538.59854: dumping result to json 27885 1726882538.59861: done dumping result, returning 27885 1726882538.59908: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-3fa5-01be-0000000003eb] 27885 1726882538.59911: sending task result for task 12673a56-9f93-3fa5-01be-0000000003eb 27885 1726882538.59972: done sending task result for task 12673a56-9f93-3fa5-01be-0000000003eb 27885 1726882538.59975: WORKER PROCESS EXITING 27885 1726882538.60036: no more pending results, returning what we have 27885 1726882538.60041: in VariableManager get_vars() 27885 1726882538.60085: Calling all_inventory to load vars for managed_node2 27885 1726882538.60088: Calling groups_inventory to load vars for managed_node2 27885 1726882538.60091: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882538.60105: Calling all_plugins_play to load vars for managed_node2 27885 1726882538.60108: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882538.60111: Calling groups_plugins_play to load vars for managed_node2 27885 1726882538.60484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 27885 1726882538.60699: done with get_vars() 27885 1726882538.60707: variable 'ansible_search_path' from source: unknown 27885 1726882538.60708: variable 'ansible_search_path' from source: unknown 27885 1726882538.60742: we have included files to process 27885 1726882538.60743: generating all_blocks data 27885 1726882538.60744: done generating all_blocks data 27885 1726882538.60745: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27885 1726882538.60746: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27885 1726882538.60748: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27885 1726882538.60910: done processing included file 27885 1726882538.60912: iterating over new_blocks loaded from include file 27885 1726882538.60913: in VariableManager get_vars() 27885 1726882538.60931: done with get_vars() 27885 1726882538.60933: filtering new block on tags 27885 1726882538.60947: done filtering new block on tags 27885 1726882538.60949: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 27885 1726882538.60953: extending task lists for all hosts with included blocks 27885 1726882538.61077: done extending task lists 27885 1726882538.61078: done processing included files 27885 1726882538.61079: results queue empty 27885 1726882538.61080: checking for any_errors_fatal 27885 1726882538.61083: done checking for any_errors_fatal 27885 1726882538.61084: checking for max_fail_percentage 27885 1726882538.61085: done checking for max_fail_percentage 27885 1726882538.61085: checking to see if all hosts have failed and the running result is not ok 27885 1726882538.61086: done checking to see if all hosts have failed 27885 1726882538.61087: getting the remaining hosts for this loop 27885 1726882538.61088: done getting the remaining hosts for this loop 27885 1726882538.61090: getting the next task for host managed_node2 27885 1726882538.61096: done getting next task for host managed_node2 27885 1726882538.61098: ^ task is: TASK: Get stat for interface {{ interface }} 27885 1726882538.61101: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882538.61103: getting variables 27885 1726882538.61104: in VariableManager get_vars() 27885 1726882538.61117: Calling all_inventory to load vars for managed_node2 27885 1726882538.61119: Calling groups_inventory to load vars for managed_node2 27885 1726882538.61121: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882538.61125: Calling all_plugins_play to load vars for managed_node2 27885 1726882538.61127: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882538.61129: Calling groups_plugins_play to load vars for managed_node2 27885 1726882538.61247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882538.61433: done with get_vars() 27885 1726882538.61443: done getting variables 27885 1726882538.61596: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest1] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:35:38 -0400 (0:00:00.029) 0:00:11.258 ****** 27885 1726882538.61627: entering _queue_task() for managed_node2/stat 27885 1726882538.61864: worker is 1 (out of 1 available) 27885 1726882538.61876: exiting _queue_task() for managed_node2/stat 27885 1726882538.61889: done queuing things up, now waiting for results queue to drain 27885 1726882538.61890: waiting for pending results... 27885 1726882538.62221: running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest1 27885 1726882538.62267: in run() - task 12673a56-9f93-3fa5-01be-000000000483 27885 1726882538.62318: variable 'ansible_search_path' from source: unknown 27885 1726882538.62322: variable 'ansible_search_path' from source: unknown 27885 1726882538.62343: calling self._execute() 27885 1726882538.62433: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882538.62498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882538.62503: variable 'omit' from source: magic vars 27885 1726882538.62826: variable 'ansible_distribution_major_version' from source: facts 27885 1726882538.62844: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882538.62858: variable 'omit' from source: magic vars 27885 1726882538.62908: variable 'omit' from source: magic vars 27885 1726882538.63011: variable 'interface' from source: set_fact 27885 1726882538.63034: variable 'omit' from source: magic vars 27885 1726882538.63081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882538.63299: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882538.63302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882538.63305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882538.63307: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882538.63309: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882538.63311: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882538.63313: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 27885 1726882538.63324: Set connection var ansible_pipelining to False 27885 1726882538.63334: Set connection var ansible_connection to ssh 27885 1726882538.63343: Set connection var ansible_timeout to 10 27885 1726882538.63349: Set connection var ansible_shell_type to sh 27885 1726882538.63361: Set connection var ansible_shell_executable to /bin/sh 27885 1726882538.63371: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882538.63401: variable 'ansible_shell_executable' from source: unknown 27885 1726882538.63410: variable 'ansible_connection' from source: unknown 27885 1726882538.63418: variable 'ansible_module_compression' from source: unknown 27885 1726882538.63430: variable 'ansible_shell_type' from source: unknown 27885 1726882538.63437: variable 'ansible_shell_executable' from source: unknown 27885 1726882538.63444: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882538.63452: variable 'ansible_pipelining' from source: unknown 27885 1726882538.63458: variable 'ansible_timeout' from source: unknown 27885 1726882538.63467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882538.63667: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882538.63684: variable 'omit' from source: magic vars 27885 1726882538.63698: starting attempt loop 27885 1726882538.63706: running the handler 27885 1726882538.63726: _low_level_execute_command(): starting 27885 1726882538.63738: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882538.64442: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882538.64455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882538.64509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882538.64569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882538.64604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882538.64688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882538.66311: stdout chunk (state=3): >>>/root <<< 27885 1726882538.66457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882538.66485: stdout chunk (state=3): >>><<< 27885 1726882538.66488: stderr chunk (state=3): >>><<< 27885 1726882538.66509: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882538.66607: _low_level_execute_command(): starting 27885 1726882538.66611: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882538.6651912-28477-122912550848490 `" && echo ansible-tmp-1726882538.6651912-28477-122912550848490="` echo /root/.ansible/tmp/ansible-tmp-1726882538.6651912-28477-122912550848490 `" ) && sleep 0' 27885 1726882538.67162: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882538.67175: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882538.67188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882538.67209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882538.67248: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882538.67306: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882538.67310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882538.67379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882538.67410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882538.67498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882538.69356: stdout chunk (state=3): >>>ansible-tmp-1726882538.6651912-28477-122912550848490=/root/.ansible/tmp/ansible-tmp-1726882538.6651912-28477-122912550848490 <<< 27885 1726882538.69513: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882538.69516: stdout chunk (state=3): >>><<< 27885 1726882538.69518: stderr chunk (state=3): >>><<< 27885 1726882538.69699: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882538.6651912-28477-122912550848490=/root/.ansible/tmp/ansible-tmp-1726882538.6651912-28477-122912550848490 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882538.69703: variable 'ansible_module_compression' from source: unknown 27885 1726882538.69705: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 27885 1726882538.69707: variable 'ansible_facts' from source: unknown 27885 1726882538.69787: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882538.6651912-28477-122912550848490/AnsiballZ_stat.py 27885 1726882538.69950: Sending initial data 27885 1726882538.69959: Sent initial data (153 bytes) 27885 1726882538.70546: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882538.70563: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882538.70602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882538.70616: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882538.70717: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882538.70740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882538.70755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 
1726882538.70849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882538.72370: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882538.72446: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882538.72528: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpg2xp1in7 /root/.ansible/tmp/ansible-tmp-1726882538.6651912-28477-122912550848490/AnsiballZ_stat.py <<< 27885 1726882538.72537: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882538.6651912-28477-122912550848490/AnsiballZ_stat.py" <<< 27885 1726882538.72584: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpg2xp1in7" to remote "/root/.ansible/tmp/ansible-tmp-1726882538.6651912-28477-122912550848490/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882538.6651912-28477-122912550848490/AnsiballZ_stat.py" <<< 27885 1726882538.73434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882538.73540: stdout chunk (state=3): >>><<< 27885 1726882538.73543: stderr chunk (state=3): >>><<< 27885 1726882538.73546: done transferring module to remote 27885 1726882538.73548: _low_level_execute_command(): starting 27885 1726882538.73550: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882538.6651912-28477-122912550848490/ /root/.ansible/tmp/ansible-tmp-1726882538.6651912-28477-122912550848490/AnsiballZ_stat.py && sleep 0' 27885 1726882538.74302: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882538.74329: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 
1726882538.74346: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882538.74442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882538.76173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882538.76181: stdout chunk (state=3): >>><<< 27885 1726882538.76192: stderr chunk (state=3): >>><<< 27885 1726882538.76264: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882538.76268: _low_level_execute_command(): starting 27885 1726882538.76270: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882538.6651912-28477-122912550848490/AnsiballZ_stat.py && sleep 0' 27885 1726882538.76646: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882538.76650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882538.76652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882538.76654: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882538.76656: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882538.76699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882538.76705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882538.76782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882538.91749: stdout chunk (state=3): >>> <<< 27885 
1726882538.91754: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30415, "dev": 23, "nlink": 1, "atime": 1726882537.2201877, "mtime": 1726882537.2201877, "ctime": 1726882537.2201877, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 27885 1726882538.92986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882538.93020: stderr chunk (state=3): >>><<< 27885 1726882538.93024: stdout chunk (state=3): >>><<< 27885 1726882538.93040: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30415, "dev": 23, "nlink": 1, "atime": 1726882537.2201877, "mtime": 1726882537.2201877, "ctime": 1726882537.2201877, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
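The invocation block echoed in the stat result above pins down what get_interface_stat.yml:3 asked for: a stat of /sys/class/net/ethtest1 with attribute, checksum and MIME collection turned off. A minimal sketch of that task, reconstructed from those module_args and from the interface_stat variable that the later assertion consumes; the real file may contain more than this.

    # Reconstructed from the module_args echoed in the stat result above.
    # The register name matches the interface_stat variable used by the assert below.
    - name: Get stat for interface {{ interface }}
      ansible.builtin.stat:
        path: /sys/class/net/{{ interface }}
        follow: false
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat

Because /sys/class/net/ethtest1 is a symlink into /sys/devices/virtual/net/, the result reports islnk: true with the lnk_source and lnk_target shown above; exists: true is the only field the subsequent assertion needs.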
27885 1726882538.93081: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882538.6651912-28477-122912550848490/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882538.93092: _low_level_execute_command(): starting 27885 1726882538.93097: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882538.6651912-28477-122912550848490/ > /dev/null 2>&1 && sleep 0' 27885 1726882538.93553: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882538.93556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882538.93559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882538.93561: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882538.93563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882538.93621: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882538.93631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882538.93633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882538.93688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882538.95470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882538.95498: stderr chunk (state=3): >>><<< 27885 1726882538.95501: stdout chunk (state=3): >>><<< 27885 1726882538.95514: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882538.95519: handler run complete 27885 1726882538.95587: attempt loop complete, returning result 27885 1726882538.95594: _execute() done 27885 1726882538.95597: dumping result to json 27885 1726882538.95600: done dumping result, returning 27885 1726882538.95607: done running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest1 [12673a56-9f93-3fa5-01be-000000000483] 27885 1726882538.95613: sending task result for task 12673a56-9f93-3fa5-01be-000000000483 27885 1726882538.95718: done sending task result for task 12673a56-9f93-3fa5-01be-000000000483 27885 1726882538.95721: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882537.2201877, "block_size": 4096, "blocks": 0, "ctime": 1726882537.2201877, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 30415, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "mode": "0777", "mtime": 1726882537.2201877, "nlink": 1, "path": "/sys/class/net/ethtest1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 27885 1726882538.95812: no more pending results, returning what we have 27885 1726882538.95816: results queue empty 27885 1726882538.95817: checking for any_errors_fatal 27885 1726882538.95818: done checking for any_errors_fatal 27885 1726882538.95819: checking for max_fail_percentage 27885 1726882538.95820: done checking for max_fail_percentage 27885 1726882538.95820: checking to see if all hosts have failed and the running result is not ok 27885 1726882538.95822: done checking to see if all hosts have failed 27885 1726882538.95822: getting the remaining hosts for this loop 27885 1726882538.95824: done getting the remaining hosts for this loop 27885 1726882538.95827: getting the next task for host managed_node2 27885 1726882538.95841: done getting next task for host managed_node2 27885 1726882538.95843: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 27885 1726882538.95846: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882538.95850: getting variables 27885 1726882538.95851: in VariableManager get_vars() 27885 1726882538.95886: Calling all_inventory to load vars for managed_node2 27885 1726882538.95889: Calling groups_inventory to load vars for managed_node2 27885 1726882538.95896: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882538.95906: Calling all_plugins_play to load vars for managed_node2 27885 1726882538.95908: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882538.95911: Calling groups_plugins_play to load vars for managed_node2 27885 1726882538.96081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882538.96218: done with get_vars() 27885 1726882538.96226: done getting variables 27885 1726882538.96267: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882538.96356: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest1'] *********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:35:38 -0400 (0:00:00.347) 0:00:11.606 ****** 27885 1726882538.96378: entering _queue_task() for managed_node2/assert 27885 1726882538.96578: worker is 1 (out of 1 available) 27885 1726882538.96596: exiting _queue_task() for managed_node2/assert 27885 1726882538.96609: done queuing things up, now waiting for results queue to drain 27885 1726882538.96610: waiting for pending results... 
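The banner above announces the assertion from assert_device_present.yml:5, and the executor trace that follows evaluates a single condition, interface_stat.stat.exists. A minimal sketch of what that file likely contains, with the include taken from the earlier 'Include the task get_interface_stat.yml' banner; the failure message is an assumption.

    # Sketch of assert_device_present.yml, inferred from the task banners and the
    # conditional evaluated below; the msg wording is an assumption.
    - name: Include the task 'get_interface_stat.yml'
      ansible.builtin.include_tasks: get_interface_stat.yml

    - name: Assert that the interface is present - '{{ interface }}'
      ansible.builtin.assert:
        that:
          - interface_stat.stat.exists
        msg: "Interface {{ interface }} is not present"   # assumed wording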
27885 1726882538.96761: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'ethtest1' 27885 1726882538.96824: in run() - task 12673a56-9f93-3fa5-01be-0000000003ec 27885 1726882538.96841: variable 'ansible_search_path' from source: unknown 27885 1726882538.96845: variable 'ansible_search_path' from source: unknown 27885 1726882538.96869: calling self._execute() 27885 1726882538.96931: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882538.96939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882538.96952: variable 'omit' from source: magic vars 27885 1726882538.97205: variable 'ansible_distribution_major_version' from source: facts 27885 1726882538.97214: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882538.97221: variable 'omit' from source: magic vars 27885 1726882538.97244: variable 'omit' from source: magic vars 27885 1726882538.97315: variable 'interface' from source: set_fact 27885 1726882538.97329: variable 'omit' from source: magic vars 27885 1726882538.97358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882538.97395: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882538.97404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882538.97417: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882538.97427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882538.97450: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882538.97453: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882538.97455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882538.97528: Set connection var ansible_pipelining to False 27885 1726882538.97531: Set connection var ansible_connection to ssh 27885 1726882538.97536: Set connection var ansible_timeout to 10 27885 1726882538.97538: Set connection var ansible_shell_type to sh 27885 1726882538.97544: Set connection var ansible_shell_executable to /bin/sh 27885 1726882538.97548: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882538.97567: variable 'ansible_shell_executable' from source: unknown 27885 1726882538.97570: variable 'ansible_connection' from source: unknown 27885 1726882538.97572: variable 'ansible_module_compression' from source: unknown 27885 1726882538.97575: variable 'ansible_shell_type' from source: unknown 27885 1726882538.97577: variable 'ansible_shell_executable' from source: unknown 27885 1726882538.97579: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882538.97582: variable 'ansible_pipelining' from source: unknown 27885 1726882538.97584: variable 'ansible_timeout' from source: unknown 27885 1726882538.97589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882538.97686: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 27885 1726882538.97695: variable 'omit' from source: magic vars 27885 1726882538.97705: starting attempt loop 27885 1726882538.97707: running the handler 27885 1726882538.97803: variable 'interface_stat' from source: set_fact 27885 1726882538.97816: Evaluated conditional (interface_stat.stat.exists): True 27885 1726882538.97821: handler run complete 27885 1726882538.97835: attempt loop complete, returning result 27885 1726882538.97838: _execute() done 27885 1726882538.97840: dumping result to json 27885 1726882538.97843: done dumping result, returning 27885 1726882538.97849: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'ethtest1' [12673a56-9f93-3fa5-01be-0000000003ec] 27885 1726882538.97855: sending task result for task 12673a56-9f93-3fa5-01be-0000000003ec 27885 1726882538.97932: done sending task result for task 12673a56-9f93-3fa5-01be-0000000003ec 27885 1726882538.97935: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 27885 1726882538.97980: no more pending results, returning what we have 27885 1726882538.97983: results queue empty 27885 1726882538.97984: checking for any_errors_fatal 27885 1726882538.97995: done checking for any_errors_fatal 27885 1726882538.97996: checking for max_fail_percentage 27885 1726882538.97998: done checking for max_fail_percentage 27885 1726882538.97998: checking to see if all hosts have failed and the running result is not ok 27885 1726882538.97999: done checking to see if all hosts have failed 27885 1726882538.98000: getting the remaining hosts for this loop 27885 1726882538.98002: done getting the remaining hosts for this loop 27885 1726882538.98005: getting the next task for host managed_node2 27885 1726882538.98013: done getting next task for host managed_node2 27885 1726882538.98018: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 27885 1726882538.98021: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882538.98035: getting variables 27885 1726882538.98036: in VariableManager get_vars() 27885 1726882538.98075: Calling all_inventory to load vars for managed_node2 27885 1726882538.98077: Calling groups_inventory to load vars for managed_node2 27885 1726882538.98079: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882538.98086: Calling all_plugins_play to load vars for managed_node2 27885 1726882538.98089: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882538.98091: Calling groups_plugins_play to load vars for managed_node2 27885 1726882538.98216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882538.98350: done with get_vars() 27885 1726882538.98357: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:35:38 -0400 (0:00:00.020) 0:00:11.626 ****** 27885 1726882538.98423: entering _queue_task() for managed_node2/include_tasks 27885 1726882538.98603: worker is 1 (out of 1 available) 27885 1726882538.98616: exiting _queue_task() for managed_node2/include_tasks 27885 1726882538.98629: done queuing things up, now waiting for results queue to drain 27885 1726882538.98630: waiting for pending results... 27885 1726882538.98785: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 27885 1726882538.98867: in run() - task 12673a56-9f93-3fa5-01be-00000000001b 27885 1726882538.98880: variable 'ansible_search_path' from source: unknown 27885 1726882538.98883: variable 'ansible_search_path' from source: unknown 27885 1726882538.98913: calling self._execute() 27885 1726882538.98968: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882538.98979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882538.98986: variable 'omit' from source: magic vars 27885 1726882538.99304: variable 'ansible_distribution_major_version' from source: facts 27885 1726882538.99307: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882538.99310: _execute() done 27885 1726882538.99314: dumping result to json 27885 1726882538.99317: done dumping result, returning 27885 1726882538.99324: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-3fa5-01be-00000000001b] 27885 1726882538.99328: sending task result for task 12673a56-9f93-3fa5-01be-00000000001b 27885 1726882538.99407: done sending task result for task 12673a56-9f93-3fa5-01be-00000000001b 27885 1726882538.99410: WORKER PROCESS EXITING 27885 1726882538.99449: no more pending results, returning what we have 27885 1726882538.99453: in VariableManager get_vars() 27885 1726882538.99497: Calling all_inventory to load vars for managed_node2 27885 1726882538.99500: Calling groups_inventory to load vars for managed_node2 27885 1726882538.99502: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882538.99510: Calling all_plugins_play to load vars for managed_node2 27885 1726882538.99512: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882538.99515: Calling groups_plugins_play to load vars for managed_node2 27885 1726882538.99677: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882538.99804: done with get_vars() 27885 1726882538.99810: variable 'ansible_search_path' from source: unknown 27885 1726882538.99810: variable 'ansible_search_path' from source: unknown 27885 1726882538.99834: we have included files to process 27885 1726882538.99835: generating all_blocks data 27885 1726882538.99836: done generating all_blocks data 27885 1726882538.99839: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27885 1726882538.99840: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27885 1726882538.99841: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27885 1726882539.00283: done processing included file 27885 1726882539.00284: iterating over new_blocks loaded from include file 27885 1726882539.00285: in VariableManager get_vars() 27885 1726882539.00304: done with get_vars() 27885 1726882539.00306: filtering new block on tags 27885 1726882539.00317: done filtering new block on tags 27885 1726882539.00319: in VariableManager get_vars() 27885 1726882539.00332: done with get_vars() 27885 1726882539.00333: filtering new block on tags 27885 1726882539.00345: done filtering new block on tags 27885 1726882539.00346: in VariableManager get_vars() 27885 1726882539.00358: done with get_vars() 27885 1726882539.00359: filtering new block on tags 27885 1726882539.00368: done filtering new block on tags 27885 1726882539.00369: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 27885 1726882539.00373: extending task lists for all hosts with included blocks 27885 1726882539.00854: done extending task lists 27885 1726882539.00855: done processing included files 27885 1726882539.00856: results queue empty 27885 1726882539.00856: checking for any_errors_fatal 27885 1726882539.00858: done checking for any_errors_fatal 27885 1726882539.00858: checking for max_fail_percentage 27885 1726882539.00859: done checking for max_fail_percentage 27885 1726882539.00859: checking to see if all hosts have failed and the running result is not ok 27885 1726882539.00860: done checking to see if all hosts have failed 27885 1726882539.00861: getting the remaining hosts for this loop 27885 1726882539.00861: done getting the remaining hosts for this loop 27885 1726882539.00863: getting the next task for host managed_node2 27885 1726882539.00865: done getting next task for host managed_node2 27885 1726882539.00867: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 27885 1726882539.00869: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882539.00875: getting variables 27885 1726882539.00875: in VariableManager get_vars() 27885 1726882539.00885: Calling all_inventory to load vars for managed_node2 27885 1726882539.00886: Calling groups_inventory to load vars for managed_node2 27885 1726882539.00887: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882539.00894: Calling all_plugins_play to load vars for managed_node2 27885 1726882539.00896: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882539.00898: Calling groups_plugins_play to load vars for managed_node2 27885 1726882539.00982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882539.01125: done with get_vars() 27885 1726882539.01132: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:35:39 -0400 (0:00:00.027) 0:00:11.654 ****** 27885 1726882539.01178: entering _queue_task() for managed_node2/setup 27885 1726882539.01356: worker is 1 (out of 1 available) 27885 1726882539.01368: exiting _queue_task() for managed_node2/setup 27885 1726882539.01380: done queuing things up, now waiting for results queue to drain 27885 1726882539.01382: waiting for pending results... 27885 1726882539.01711: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 27885 1726882539.01717: in run() - task 12673a56-9f93-3fa5-01be-00000000049b 27885 1726882539.01723: variable 'ansible_search_path' from source: unknown 27885 1726882539.01732: variable 'ansible_search_path' from source: unknown 27885 1726882539.01771: calling self._execute() 27885 1726882539.01855: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882539.01867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882539.01882: variable 'omit' from source: magic vars 27885 1726882539.02242: variable 'ansible_distribution_major_version' from source: facts 27885 1726882539.02260: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882539.02528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882539.04729: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882539.04813: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882539.04895: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882539.04903: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882539.04937: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882539.05039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
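
The include and the guarded fact gathering seen here follow a common role pattern: main.yml:4 includes set_facts.yml, and the setup task at set_facts.yml:3 only runs when some required fact is missing. Sketched from the task names, paths, and the conditional evaluated just below; the gather_subset value is an assumption, since the log does not print the task's arguments, while no_log: true matches the censored result shown after the skip:

    # roles/network/tasks/main.yml (sketch)
    - name: Ensure ansible_facts used by role
      ansible.builtin.include_tasks: set_facts.yml

    # roles/network/tasks/set_facts.yml (sketch)
    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min   # assumption; the actual subset is not visible in this log
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true           # matches the censored result reported below
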
27885 1726882539.05075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882539.05114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882539.05198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882539.05201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882539.05246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882539.05276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882539.05311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882539.05361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882539.05399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882539.05558: variable '__network_required_facts' from source: role '' defaults 27885 1726882539.05575: variable 'ansible_facts' from source: unknown 27885 1726882539.05761: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 27885 1726882539.05765: when evaluation is False, skipping this task 27885 1726882539.05767: _execute() done 27885 1726882539.05769: dumping result to json 27885 1726882539.05772: done dumping result, returning 27885 1726882539.05774: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-3fa5-01be-00000000049b] 27885 1726882539.05776: sending task result for task 12673a56-9f93-3fa5-01be-00000000049b 27885 1726882539.05846: done sending task result for task 12673a56-9f93-3fa5-01be-00000000049b 27885 1726882539.05850: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27885 1726882539.05898: no more pending results, returning what we have 27885 1726882539.05903: results queue empty 27885 1726882539.05904: checking for any_errors_fatal 27885 1726882539.05906: done checking for any_errors_fatal 27885 1726882539.05907: checking for max_fail_percentage 27885 1726882539.05908: done checking for max_fail_percentage 27885 1726882539.05909: checking to see if all hosts have failed and the running 
result is not ok 27885 1726882539.05910: done checking to see if all hosts have failed 27885 1726882539.05910: getting the remaining hosts for this loop 27885 1726882539.05912: done getting the remaining hosts for this loop 27885 1726882539.05916: getting the next task for host managed_node2 27885 1726882539.05925: done getting next task for host managed_node2 27885 1726882539.05929: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 27885 1726882539.05934: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882539.05948: getting variables 27885 1726882539.05949: in VariableManager get_vars() 27885 1726882539.05999: Calling all_inventory to load vars for managed_node2 27885 1726882539.06002: Calling groups_inventory to load vars for managed_node2 27885 1726882539.06005: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882539.06017: Calling all_plugins_play to load vars for managed_node2 27885 1726882539.06020: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882539.06024: Calling groups_plugins_play to load vars for managed_node2 27885 1726882539.06472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882539.06811: done with get_vars() 27885 1726882539.06824: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:35:39 -0400 (0:00:00.057) 0:00:11.711 ****** 27885 1726882539.06921: entering _queue_task() for managed_node2/stat 27885 1726882539.07168: worker is 1 (out of 1 available) 27885 1726882539.07182: exiting _queue_task() for managed_node2/stat 27885 1726882539.07397: done queuing things up, now waiting for results queue to drain 27885 1726882539.07399: waiting for pending results... 
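
The ostree check queued here (set_facts.yml:12) is a stat task guarded by "not __network_is_ostree is defined"; in this run it is skipped because the flag already exists, as the false_condition just below shows. A hedged sketch: the /run/ostree-booted path and the register name are assumptions based on the usual way roles detect ostree, neither is printed for a skipped task:

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted        # assumed path; not shown in this log
      register: __ostree_booted_stat    # illustrative register name
      when: not __network_is_ostree is defined
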
27885 1726882539.07527: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 27885 1726882539.07733: in run() - task 12673a56-9f93-3fa5-01be-00000000049d 27885 1726882539.07737: variable 'ansible_search_path' from source: unknown 27885 1726882539.07739: variable 'ansible_search_path' from source: unknown 27885 1726882539.07743: calling self._execute() 27885 1726882539.07757: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882539.07771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882539.07785: variable 'omit' from source: magic vars 27885 1726882539.08133: variable 'ansible_distribution_major_version' from source: facts 27885 1726882539.08148: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882539.08313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882539.08578: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882539.08628: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882539.08663: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882539.08705: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882539.08787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882539.08823: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882539.08852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882539.08880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882539.08968: variable '__network_is_ostree' from source: set_fact 27885 1726882539.08980: Evaluated conditional (not __network_is_ostree is defined): False 27885 1726882539.08987: when evaluation is False, skipping this task 27885 1726882539.09037: _execute() done 27885 1726882539.09040: dumping result to json 27885 1726882539.09043: done dumping result, returning 27885 1726882539.09045: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-3fa5-01be-00000000049d] 27885 1726882539.09047: sending task result for task 12673a56-9f93-3fa5-01be-00000000049d skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 27885 1726882539.09191: no more pending results, returning what we have 27885 1726882539.09197: results queue empty 27885 1726882539.09199: checking for any_errors_fatal 27885 1726882539.09205: done checking for any_errors_fatal 27885 1726882539.09206: checking for max_fail_percentage 27885 1726882539.09207: done checking for max_fail_percentage 27885 1726882539.09208: checking to see if all hosts have 
failed and the running result is not ok 27885 1726882539.09209: done checking to see if all hosts have failed 27885 1726882539.09210: getting the remaining hosts for this loop 27885 1726882539.09212: done getting the remaining hosts for this loop 27885 1726882539.09217: getting the next task for host managed_node2 27885 1726882539.09225: done getting next task for host managed_node2 27885 1726882539.09228: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 27885 1726882539.09234: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882539.09248: getting variables 27885 1726882539.09250: in VariableManager get_vars() 27885 1726882539.09497: Calling all_inventory to load vars for managed_node2 27885 1726882539.09501: Calling groups_inventory to load vars for managed_node2 27885 1726882539.09504: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882539.09510: done sending task result for task 12673a56-9f93-3fa5-01be-00000000049d 27885 1726882539.09512: WORKER PROCESS EXITING 27885 1726882539.09520: Calling all_plugins_play to load vars for managed_node2 27885 1726882539.09523: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882539.09525: Calling groups_plugins_play to load vars for managed_node2 27885 1726882539.09714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882539.09935: done with get_vars() 27885 1726882539.09946: done getting variables 27885 1726882539.09998: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:35:39 -0400 (0:00:00.031) 0:00:11.742 ****** 27885 1726882539.10031: entering _queue_task() for managed_node2/set_fact 27885 1726882539.10246: worker is 1 (out of 1 available) 27885 1726882539.10257: exiting _queue_task() for managed_node2/set_fact 27885 1726882539.10268: done queuing things up, now waiting for results queue to drain 27885 1726882539.10270: waiting for pending results... 
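
The companion set_fact task queued here (set_facts.yml:17) would normally record the result of the stat above as __network_is_ostree; it is skipped for the same reason, the flag is already defined. A sketch under the same assumptions as the previous block (the register name and the exact expression are illustrative):

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists | default(false) }}"  # illustrative expression
      when: not __network_is_ostree is defined
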
27885 1726882539.10518: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 27885 1726882539.10659: in run() - task 12673a56-9f93-3fa5-01be-00000000049e 27885 1726882539.10678: variable 'ansible_search_path' from source: unknown 27885 1726882539.10686: variable 'ansible_search_path' from source: unknown 27885 1726882539.10734: calling self._execute() 27885 1726882539.10814: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882539.10831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882539.10843: variable 'omit' from source: magic vars 27885 1726882539.11264: variable 'ansible_distribution_major_version' from source: facts 27885 1726882539.11280: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882539.11441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882539.11706: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882539.11898: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882539.11901: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882539.11903: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882539.11906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882539.11927: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882539.11955: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882539.11984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882539.12074: variable '__network_is_ostree' from source: set_fact 27885 1726882539.12086: Evaluated conditional (not __network_is_ostree is defined): False 27885 1726882539.12095: when evaluation is False, skipping this task 27885 1726882539.12103: _execute() done 27885 1726882539.12110: dumping result to json 27885 1726882539.12117: done dumping result, returning 27885 1726882539.12131: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-3fa5-01be-00000000049e] 27885 1726882539.12141: sending task result for task 12673a56-9f93-3fa5-01be-00000000049e skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 27885 1726882539.12274: no more pending results, returning what we have 27885 1726882539.12279: results queue empty 27885 1726882539.12280: checking for any_errors_fatal 27885 1726882539.12287: done checking for any_errors_fatal 27885 1726882539.12288: checking for max_fail_percentage 27885 1726882539.12290: done checking for max_fail_percentage 27885 1726882539.12291: checking to see 
if all hosts have failed and the running result is not ok 27885 1726882539.12291: done checking to see if all hosts have failed 27885 1726882539.12292: getting the remaining hosts for this loop 27885 1726882539.12295: done getting the remaining hosts for this loop 27885 1726882539.12299: getting the next task for host managed_node2 27885 1726882539.12308: done getting next task for host managed_node2 27885 1726882539.12312: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 27885 1726882539.12317: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882539.12329: getting variables 27885 1726882539.12331: in VariableManager get_vars() 27885 1726882539.12374: Calling all_inventory to load vars for managed_node2 27885 1726882539.12376: Calling groups_inventory to load vars for managed_node2 27885 1726882539.12379: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882539.12389: Calling all_plugins_play to load vars for managed_node2 27885 1726882539.12391: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882539.12500: Calling groups_plugins_play to load vars for managed_node2 27885 1726882539.12901: done sending task result for task 12673a56-9f93-3fa5-01be-00000000049e 27885 1726882539.12905: WORKER PROCESS EXITING 27885 1726882539.12925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882539.13150: done with get_vars() 27885 1726882539.13161: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:35:39 -0400 (0:00:00.032) 0:00:11.775 ****** 27885 1726882539.13252: entering _queue_task() for managed_node2/service_facts 27885 1726882539.13254: Creating lock for service_facts 27885 1726882539.13518: worker is 1 (out of 1 available) 27885 1726882539.13529: exiting _queue_task() for managed_node2/service_facts 27885 1726882539.13543: done queuing things up, now waiting for results queue to drain 27885 1726882539.13545: waiting for pending results... 
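
The service_facts task queued here (set_facts.yml:21) takes no parameters; the module enumerates service units on the target and returns them under ansible_facts.services, which is exactly the JSON payload that comes back near the end of this excerpt. A sketch of the task, plus one illustrative consumer of the returned facts (the debug task is not part of the role; NetworkManager.service is taken from the results printed below):

    - name: Check which services are running
      ansible.builtin.service_facts:

    # Illustrative consumer only: read one unit from the returned facts
    - name: Show NetworkManager state
      ansible.builtin.debug:
        msg: "{{ ansible_facts.services['NetworkManager.service'].state }}"
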
27885 1726882539.13806: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 27885 1726882539.13950: in run() - task 12673a56-9f93-3fa5-01be-0000000004a0 27885 1726882539.13969: variable 'ansible_search_path' from source: unknown 27885 1726882539.13976: variable 'ansible_search_path' from source: unknown 27885 1726882539.14017: calling self._execute() 27885 1726882539.14104: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882539.14116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882539.14128: variable 'omit' from source: magic vars 27885 1726882539.14491: variable 'ansible_distribution_major_version' from source: facts 27885 1726882539.14511: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882539.14522: variable 'omit' from source: magic vars 27885 1726882539.14598: variable 'omit' from source: magic vars 27885 1726882539.14636: variable 'omit' from source: magic vars 27885 1726882539.14678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882539.14722: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882539.14748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882539.14769: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882539.14786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882539.14833: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882539.14900: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882539.14903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882539.14952: Set connection var ansible_pipelining to False 27885 1726882539.14961: Set connection var ansible_connection to ssh 27885 1726882539.14968: Set connection var ansible_timeout to 10 27885 1726882539.14973: Set connection var ansible_shell_type to sh 27885 1726882539.14980: Set connection var ansible_shell_executable to /bin/sh 27885 1726882539.14987: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882539.15017: variable 'ansible_shell_executable' from source: unknown 27885 1726882539.15023: variable 'ansible_connection' from source: unknown 27885 1726882539.15029: variable 'ansible_module_compression' from source: unknown 27885 1726882539.15034: variable 'ansible_shell_type' from source: unknown 27885 1726882539.15039: variable 'ansible_shell_executable' from source: unknown 27885 1726882539.15043: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882539.15048: variable 'ansible_pipelining' from source: unknown 27885 1726882539.15053: variable 'ansible_timeout' from source: unknown 27885 1726882539.15058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882539.15399: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882539.15403: variable 'omit' from source: magic vars 27885 
1726882539.15406: starting attempt loop 27885 1726882539.15408: running the handler 27885 1726882539.15411: _low_level_execute_command(): starting 27885 1726882539.15413: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882539.16038: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882539.16057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882539.16078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882539.16190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882539.16217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882539.16324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882539.17922: stdout chunk (state=3): >>>/root <<< 27885 1726882539.18063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882539.18107: stdout chunk (state=3): >>><<< 27885 1726882539.18110: stderr chunk (state=3): >>><<< 27885 1726882539.18128: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882539.18149: _low_level_execute_command(): starting 27885 1726882539.18161: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882539.181351-28505-274139508285559 `" && echo 
ansible-tmp-1726882539.181351-28505-274139508285559="` echo /root/.ansible/tmp/ansible-tmp-1726882539.181351-28505-274139508285559 `" ) && sleep 0' 27885 1726882539.18789: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882539.18809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882539.18825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882539.18850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882539.18919: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882539.18985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882539.19016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882539.19031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882539.19225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882539.21079: stdout chunk (state=3): >>>ansible-tmp-1726882539.181351-28505-274139508285559=/root/.ansible/tmp/ansible-tmp-1726882539.181351-28505-274139508285559 <<< 27885 1726882539.21243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882539.21246: stdout chunk (state=3): >>><<< 27885 1726882539.21248: stderr chunk (state=3): >>><<< 27885 1726882539.21407: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882539.181351-28505-274139508285559=/root/.ansible/tmp/ansible-tmp-1726882539.181351-28505-274139508285559 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882539.21411: variable 
'ansible_module_compression' from source: unknown 27885 1726882539.21413: ANSIBALLZ: Using lock for service_facts 27885 1726882539.21415: ANSIBALLZ: Acquiring lock 27885 1726882539.21417: ANSIBALLZ: Lock acquired: 140560082078464 27885 1726882539.21419: ANSIBALLZ: Creating module 27885 1726882539.34285: ANSIBALLZ: Writing module into payload 27885 1726882539.34381: ANSIBALLZ: Writing module 27885 1726882539.34411: ANSIBALLZ: Renaming module 27885 1726882539.34419: ANSIBALLZ: Done creating module 27885 1726882539.34435: variable 'ansible_facts' from source: unknown 27885 1726882539.34509: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882539.181351-28505-274139508285559/AnsiballZ_service_facts.py 27885 1726882539.34735: Sending initial data 27885 1726882539.34739: Sent initial data (161 bytes) 27885 1726882539.35228: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882539.35232: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882539.35240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882539.35257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882539.35309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882539.35359: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882539.35377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882539.35387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882539.35477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882539.37035: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882539.37109: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27885 1726882539.37178: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmplscqwb61 /root/.ansible/tmp/ansible-tmp-1726882539.181351-28505-274139508285559/AnsiballZ_service_facts.py <<< 27885 1726882539.37187: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882539.181351-28505-274139508285559/AnsiballZ_service_facts.py" <<< 27885 1726882539.37257: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmplscqwb61" to remote "/root/.ansible/tmp/ansible-tmp-1726882539.181351-28505-274139508285559/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882539.181351-28505-274139508285559/AnsiballZ_service_facts.py" <<< 27885 1726882539.38055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882539.38091: stderr chunk (state=3): >>><<< 27885 1726882539.38098: stdout chunk (state=3): >>><<< 27885 1726882539.38123: done transferring module to remote 27885 1726882539.38134: _low_level_execute_command(): starting 27885 1726882539.38137: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882539.181351-28505-274139508285559/ /root/.ansible/tmp/ansible-tmp-1726882539.181351-28505-274139508285559/AnsiballZ_service_facts.py && sleep 0' 27885 1726882539.38549: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882539.38552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882539.38580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882539.38583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27885 1726882539.38586: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882539.38600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882539.38645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882539.38649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882539.38721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882539.40579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882539.40582: stdout chunk (state=3): >>><<< 27885 1726882539.40584: stderr chunk (state=3): >>><<< 27885 1726882539.40587: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882539.40588: _low_level_execute_command(): starting 27885 1726882539.40594: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882539.181351-28505-274139508285559/AnsiballZ_service_facts.py && sleep 0' 27885 1726882539.41139: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882539.41143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882539.41146: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882539.41148: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882539.41156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882539.41201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882539.41217: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882539.41287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882540.92678: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 27885 1726882540.92696: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": 
{"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.s<<< 27885 1726882540.92722: stdout chunk (state=3): >>>ervice", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": 
"systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 27885 1726882540.92748: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", 
"source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state":<<< 27885 1726882540.92758: stdout chunk (state=3): >>> "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": <<< 27885 1726882540.92764: stdout chunk (state=3): >>>"static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 27885 1726882540.94208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882540.94230: stderr chunk (state=3): >>><<< 27885 1726882540.94233: stdout chunk (state=3): >>><<< 27885 1726882540.94257: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": 
"systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 27885 1726882540.95458: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882539.181351-28505-274139508285559/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882540.95468: _low_level_execute_command(): starting 27885 1726882540.95471: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882539.181351-28505-274139508285559/ > /dev/null 2>&1 && sleep 0' 27885 1726882540.96041: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882540.96069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882540.96203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882540.97980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882540.98005: stdout chunk (state=3): >>><<< 27885 1726882540.98018: stderr chunk (state=3): >>><<< 27885 1726882540.98198: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882540.98202: handler run complete 27885 1726882540.98246: variable 'ansible_facts' from source: unknown 27885 1726882540.98415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882540.99188: variable 'ansible_facts' from source: unknown 27885 1726882540.99376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882540.99900: attempt loop complete, returning result 27885 1726882540.99904: _execute() done 27885 1726882540.99906: dumping result to json 27885 1726882541.00035: done dumping result, returning 27885 1726882541.00050: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-3fa5-01be-0000000004a0] 27885 1726882541.00061: sending task result for task 12673a56-9f93-3fa5-01be-0000000004a0 27885 1726882541.02453: done sending task result for task 12673a56-9f93-3fa5-01be-0000000004a0 27885 1726882541.02457: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27885 1726882541.02552: no more pending results, returning what we have 27885 1726882541.02555: results queue empty 27885 1726882541.02556: checking for any_errors_fatal 27885 1726882541.02559: done checking for any_errors_fatal 27885 1726882541.02559: checking for max_fail_percentage 27885 1726882541.02561: done checking for max_fail_percentage 27885 1726882541.02562: checking to see if all hosts have failed and the running result is not ok 27885 1726882541.02562: done checking to see if all hosts have failed 27885 1726882541.02563: getting the remaining hosts for this loop 27885 1726882541.02564: done getting the remaining hosts for this loop 27885 1726882541.02567: getting the next task for host managed_node2 27885 1726882541.02571: done getting next task for host managed_node2 27885 1726882541.02575: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 27885 1726882541.02579: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882541.02588: getting variables 27885 1726882541.02592: in VariableManager get_vars() 27885 1726882541.02628: Calling all_inventory to load vars for managed_node2 27885 1726882541.02631: Calling groups_inventory to load vars for managed_node2 27885 1726882541.02633: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882541.02642: Calling all_plugins_play to load vars for managed_node2 27885 1726882541.02644: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882541.02647: Calling groups_plugins_play to load vars for managed_node2 27885 1726882541.03370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882541.04191: done with get_vars() 27885 1726882541.04206: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:35:41 -0400 (0:00:01.910) 0:00:13.685 ****** 27885 1726882541.04298: entering _queue_task() for managed_node2/package_facts 27885 1726882541.04300: Creating lock for package_facts 27885 1726882541.04580: worker is 1 (out of 1 available) 27885 1726882541.04591: exiting _queue_task() for managed_node2/package_facts 27885 1726882541.04605: done queuing things up, now waiting for results queue to drain 27885 1726882541.04606: waiting for pending results... 27885 1726882541.04871: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 27885 1726882541.05201: in run() - task 12673a56-9f93-3fa5-01be-0000000004a1 27885 1726882541.05205: variable 'ansible_search_path' from source: unknown 27885 1726882541.05208: variable 'ansible_search_path' from source: unknown 27885 1726882541.05210: calling self._execute() 27885 1726882541.05212: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882541.05214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882541.05218: variable 'omit' from source: magic vars 27885 1726882541.05534: variable 'ansible_distribution_major_version' from source: facts 27885 1726882541.05550: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882541.05561: variable 'omit' from source: magic vars 27885 1726882541.05635: variable 'omit' from source: magic vars 27885 1726882541.05672: variable 'omit' from source: magic vars 27885 1726882541.05714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882541.05758: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882541.05784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882541.05810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882541.05828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882541.05867: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 
1726882541.05878: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882541.05885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882541.05995: Set connection var ansible_pipelining to False 27885 1726882541.06007: Set connection var ansible_connection to ssh 27885 1726882541.06019: Set connection var ansible_timeout to 10 27885 1726882541.06026: Set connection var ansible_shell_type to sh 27885 1726882541.06035: Set connection var ansible_shell_executable to /bin/sh 27885 1726882541.06044: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882541.06076: variable 'ansible_shell_executable' from source: unknown 27885 1726882541.06085: variable 'ansible_connection' from source: unknown 27885 1726882541.06095: variable 'ansible_module_compression' from source: unknown 27885 1726882541.06103: variable 'ansible_shell_type' from source: unknown 27885 1726882541.06110: variable 'ansible_shell_executable' from source: unknown 27885 1726882541.06117: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882541.06123: variable 'ansible_pipelining' from source: unknown 27885 1726882541.06129: variable 'ansible_timeout' from source: unknown 27885 1726882541.06135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882541.06329: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882541.06345: variable 'omit' from source: magic vars 27885 1726882541.06354: starting attempt loop 27885 1726882541.06360: running the handler 27885 1726882541.06377: _low_level_execute_command(): starting 27885 1726882541.06391: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882541.07065: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882541.07081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882541.07096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882541.07118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882541.07173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882541.07232: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882541.07247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882541.07280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882541.07385: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 27885 1726882541.08939: stdout chunk (state=3): >>>/root <<< 27885 1726882541.09094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882541.09098: stdout chunk (state=3): >>><<< 27885 1726882541.09100: stderr chunk (state=3): >>><<< 27885 1726882541.09218: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882541.09222: _low_level_execute_command(): starting 27885 1726882541.09225: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882541.091204-28559-151324001392476 `" && echo ansible-tmp-1726882541.091204-28559-151324001392476="` echo /root/.ansible/tmp/ansible-tmp-1726882541.091204-28559-151324001392476 `" ) && sleep 0' 27885 1726882541.10160: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882541.10169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882541.10179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882541.10181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882541.10184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882541.10427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882541.10431: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882541.10511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 
1726882541.12366: stdout chunk (state=3): >>>ansible-tmp-1726882541.091204-28559-151324001392476=/root/.ansible/tmp/ansible-tmp-1726882541.091204-28559-151324001392476 <<< 27885 1726882541.12468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882541.12506: stderr chunk (state=3): >>><<< 27885 1726882541.12515: stdout chunk (state=3): >>><<< 27885 1726882541.12999: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882541.091204-28559-151324001392476=/root/.ansible/tmp/ansible-tmp-1726882541.091204-28559-151324001392476 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882541.13004: variable 'ansible_module_compression' from source: unknown 27885 1726882541.13007: ANSIBALLZ: Using lock for package_facts 27885 1726882541.13009: ANSIBALLZ: Acquiring lock 27885 1726882541.13011: ANSIBALLZ: Lock acquired: 140560081159664 27885 1726882541.13013: ANSIBALLZ: Creating module 27885 1726882541.63175: ANSIBALLZ: Writing module into payload 27885 1726882541.63367: ANSIBALLZ: Writing module 27885 1726882541.63434: ANSIBALLZ: Renaming module 27885 1726882541.63445: ANSIBALLZ: Done creating module 27885 1726882541.63482: variable 'ansible_facts' from source: unknown 27885 1726882541.63744: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882541.091204-28559-151324001392476/AnsiballZ_package_facts.py 27885 1726882541.63967: Sending initial data 27885 1726882541.63975: Sent initial data (161 bytes) 27885 1726882541.64501: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882541.64516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882541.64530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882541.64615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882541.64652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882541.64678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882541.64702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882541.64796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882541.66503: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882541.66558: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882541.66648: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp3tvh76m6 /root/.ansible/tmp/ansible-tmp-1726882541.091204-28559-151324001392476/AnsiballZ_package_facts.py <<< 27885 1726882541.66651: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882541.091204-28559-151324001392476/AnsiballZ_package_facts.py" <<< 27885 1726882541.66701: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp3tvh76m6" to remote "/root/.ansible/tmp/ansible-tmp-1726882541.091204-28559-151324001392476/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882541.091204-28559-151324001392476/AnsiballZ_package_facts.py" <<< 27885 1726882541.68392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882541.68536: stderr chunk (state=3): >>><<< 27885 1726882541.68539: stdout chunk (state=3): >>><<< 27885 1726882541.68541: done transferring module to remote 27885 1726882541.68543: _low_level_execute_command(): starting 27885 1726882541.68545: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882541.091204-28559-151324001392476/ /root/.ansible/tmp/ansible-tmp-1726882541.091204-28559-151324001392476/AnsiballZ_package_facts.py && sleep 0' 27885 1726882541.69079: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882541.69099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882541.69115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882541.69144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882541.69251: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882541.69264: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882541.69278: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882541.69307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882541.69397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882541.71225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882541.71229: stdout chunk (state=3): >>><<< 27885 1726882541.71231: stderr chunk (state=3): >>><<< 27885 1726882541.71272: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882541.71345: _low_level_execute_command(): starting 27885 1726882541.71348: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882541.091204-28559-151324001392476/AnsiballZ_package_facts.py && sleep 0' 27885 1726882541.72201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882541.72212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882541.72225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882541.72315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882542.16161: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": 
"nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": 
"11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": 
"1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 27885 1726882542.16504: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": 
[{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": 
"rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": 
"4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": 
"6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 27885 1726882542.16518: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": 
"chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", 
"version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", 
"version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": 
"nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 27885 1726882542.17992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882542.18008: stderr chunk (state=3): >>>Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882542.18061: stderr chunk (state=3): >>><<< 27885 1726882542.18071: stdout chunk (state=3): >>><<< 27885 1726882542.18114: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 27885 1726882542.21487: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882541.091204-28559-151324001392476/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882542.21528: _low_level_execute_command(): starting 27885 1726882542.21596: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882541.091204-28559-151324001392476/ > /dev/null 2>&1 && sleep 0' 27885 1726882542.22145: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882542.22162: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882542.22176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882542.22197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882542.22217: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882542.22231: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882542.22247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882542.22267: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27885 1726882542.22352: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882542.22377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882542.22475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882542.24376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882542.24389: stdout chunk (state=3): >>><<< 27885 1726882542.24698: stderr chunk (state=3): >>><<< 27885 1726882542.24702: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882542.24705: handler run complete 27885 1726882542.26317: variable 'ansible_facts' from source: unknown 27885 1726882542.26805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882542.28964: variable 'ansible_facts' from source: unknown 27885 1726882542.29799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882542.30480: attempt loop complete, returning result 27885 1726882542.30499: _execute() done 27885 1726882542.30502: dumping result to json 27885 1726882542.30866: done dumping result, returning 27885 1726882542.30874: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-3fa5-01be-0000000004a1] 27885 1726882542.30877: sending task result for task 12673a56-9f93-3fa5-01be-0000000004a1 27885 1726882542.33422: done sending task result for task 12673a56-9f93-3fa5-01be-0000000004a1 27885 1726882542.33426: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27885 1726882542.33527: no more pending results, returning what we have 27885 1726882542.33530: results queue empty 27885 1726882542.33531: checking for any_errors_fatal 27885 1726882542.33536: done checking for any_errors_fatal 27885 1726882542.33537: checking for max_fail_percentage 27885 1726882542.33538: done checking for max_fail_percentage 27885 1726882542.33539: checking to see if all hosts have failed and the running result is not ok 27885 1726882542.33540: done checking to see if all hosts have failed 27885 1726882542.33541: getting the remaining hosts for this loop 27885 1726882542.33542: done getting the remaining hosts for this loop 27885 1726882542.33545: getting the next task for host managed_node2 27885 1726882542.33551: done getting next task for host managed_node2 27885 1726882542.33555: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 27885 1726882542.33558: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882542.33568: getting variables 27885 1726882542.33569: in VariableManager get_vars() 27885 1726882542.33605: Calling all_inventory to load vars for managed_node2 27885 1726882542.33608: Calling groups_inventory to load vars for managed_node2 27885 1726882542.33610: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882542.33620: Calling all_plugins_play to load vars for managed_node2 27885 1726882542.33623: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882542.33626: Calling groups_plugins_play to load vars for managed_node2 27885 1726882542.35288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882542.37138: done with get_vars() 27885 1726882542.37158: done getting variables 27885 1726882542.37377: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:35:42 -0400 (0:00:01.331) 0:00:15.016 ****** 27885 1726882542.37425: entering _queue_task() for managed_node2/debug 27885 1726882542.37897: worker is 1 (out of 1 available) 27885 1726882542.37907: exiting _queue_task() for managed_node2/debug 27885 1726882542.37917: done queuing things up, now waiting for results queue to drain 27885 1726882542.37919: waiting for pending results... 
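For reference: the "Check which packages are installed" step whose output appears above is a run of the package_facts module (the module_args show manager: ["auto"], strategy: "first"), and its per-host result is censored in the task recap because no_log: true is set, even though the raw module output with the full RPM map is still visible in this debug log. A minimal standalone sketch of the same pattern follows; the play name, host pattern, and the grub2-pc lookup are illustrative assumptions, not taken from the role's source.

---
# Illustrative sketch: gather package facts and query one entry from the result.
- name: Gather and inspect package facts (sketch)
  hosts: managed_node2          # illustrative host pattern
  gather_facts: false
  tasks:
    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto
      no_log: true              # hides the very large result from normal task output

    - name: Report whether grub2-pc is installed
      ansible.builtin.debug:
        msg: "grub2-pc installed: {{ 'grub2-pc' in ansible_facts.packages }}"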
27885 1726882542.38048: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 27885 1726882542.38256: in run() - task 12673a56-9f93-3fa5-01be-00000000001c 27885 1726882542.38259: variable 'ansible_search_path' from source: unknown 27885 1726882542.38262: variable 'ansible_search_path' from source: unknown 27885 1726882542.38264: calling self._execute() 27885 1726882542.38333: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882542.38345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882542.38364: variable 'omit' from source: magic vars 27885 1726882542.38740: variable 'ansible_distribution_major_version' from source: facts 27885 1726882542.38756: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882542.38766: variable 'omit' from source: magic vars 27885 1726882542.38830: variable 'omit' from source: magic vars 27885 1726882542.38935: variable 'network_provider' from source: set_fact 27885 1726882542.38958: variable 'omit' from source: magic vars 27885 1726882542.39003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882542.39099: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882542.39102: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882542.39104: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882542.39107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882542.39144: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882542.39152: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882542.39160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882542.39269: Set connection var ansible_pipelining to False 27885 1726882542.39279: Set connection var ansible_connection to ssh 27885 1726882542.39295: Set connection var ansible_timeout to 10 27885 1726882542.39303: Set connection var ansible_shell_type to sh 27885 1726882542.39312: Set connection var ansible_shell_executable to /bin/sh 27885 1726882542.39343: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882542.39354: variable 'ansible_shell_executable' from source: unknown 27885 1726882542.39361: variable 'ansible_connection' from source: unknown 27885 1726882542.39368: variable 'ansible_module_compression' from source: unknown 27885 1726882542.39374: variable 'ansible_shell_type' from source: unknown 27885 1726882542.39452: variable 'ansible_shell_executable' from source: unknown 27885 1726882542.39455: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882542.39458: variable 'ansible_pipelining' from source: unknown 27885 1726882542.39460: variable 'ansible_timeout' from source: unknown 27885 1726882542.39462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882542.39547: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 27885 1726882542.39569: variable 'omit' from source: magic vars 27885 1726882542.39578: starting attempt loop 27885 1726882542.39584: running the handler 27885 1726882542.39633: handler run complete 27885 1726882542.39652: attempt loop complete, returning result 27885 1726882542.39659: _execute() done 27885 1726882542.39673: dumping result to json 27885 1726882542.39680: done dumping result, returning 27885 1726882542.39696: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-3fa5-01be-00000000001c] 27885 1726882542.39708: sending task result for task 12673a56-9f93-3fa5-01be-00000000001c 27885 1726882542.39840: done sending task result for task 12673a56-9f93-3fa5-01be-00000000001c 27885 1726882542.39843: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 27885 1726882542.39942: no more pending results, returning what we have 27885 1726882542.39946: results queue empty 27885 1726882542.39947: checking for any_errors_fatal 27885 1726882542.39957: done checking for any_errors_fatal 27885 1726882542.39958: checking for max_fail_percentage 27885 1726882542.39959: done checking for max_fail_percentage 27885 1726882542.39960: checking to see if all hosts have failed and the running result is not ok 27885 1726882542.39961: done checking to see if all hosts have failed 27885 1726882542.39962: getting the remaining hosts for this loop 27885 1726882542.39963: done getting the remaining hosts for this loop 27885 1726882542.39967: getting the next task for host managed_node2 27885 1726882542.39973: done getting next task for host managed_node2 27885 1726882542.39977: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 27885 1726882542.39981: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882542.39997: getting variables 27885 1726882542.39998: in VariableManager get_vars() 27885 1726882542.40039: Calling all_inventory to load vars for managed_node2 27885 1726882542.40042: Calling groups_inventory to load vars for managed_node2 27885 1726882542.40045: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882542.40054: Calling all_plugins_play to load vars for managed_node2 27885 1726882542.40057: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882542.40060: Calling groups_plugins_play to load vars for managed_node2 27885 1726882542.41601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882542.43222: done with get_vars() 27885 1726882542.43246: done getting variables 27885 1726882542.43315: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:35:42 -0400 (0:00:00.059) 0:00:15.075 ****** 27885 1726882542.43347: entering _queue_task() for managed_node2/fail 27885 1726882542.43654: worker is 1 (out of 1 available) 27885 1726882542.43666: exiting _queue_task() for managed_node2/fail 27885 1726882542.43677: done queuing things up, now waiting for results queue to drain 27885 1726882542.43678: waiting for pending results... 
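The "Print network provider" task above (roles/network/tasks/main.yml:7) simply debugs a network_provider value that an earlier set_fact established; on this host it resolved to nm. The sketch below reproduces that shape under stated assumptions: the provider-selection expression is a guess for illustration only (the actual role derives the value differently), and only the variable name and the printed message format are taken from the log.

---
# Illustrative sketch: pick a provider value and print it, mirroring the log's
# "Using network provider: nm" message.
- name: Select and print the network provider (sketch)
  hosts: managed_node2
  gather_facts: false
  tasks:
    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto

    - name: Set network provider (assumed selection logic, for illustration)
      ansible.builtin.set_fact:
        network_provider: "{{ 'nm' if 'NetworkManager' in ansible_facts.packages else 'initscripts' }}"

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"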
27885 1726882542.44022: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 27885 1726882542.44088: in run() - task 12673a56-9f93-3fa5-01be-00000000001d 27885 1726882542.44118: variable 'ansible_search_path' from source: unknown 27885 1726882542.44200: variable 'ansible_search_path' from source: unknown 27885 1726882542.44205: calling self._execute() 27885 1726882542.44265: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882542.44298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882542.44302: variable 'omit' from source: magic vars 27885 1726882542.44706: variable 'ansible_distribution_major_version' from source: facts 27885 1726882542.44756: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882542.44854: variable 'network_state' from source: role '' defaults 27885 1726882542.44877: Evaluated conditional (network_state != {}): False 27885 1726882542.44887: when evaluation is False, skipping this task 27885 1726882542.44900: _execute() done 27885 1726882542.44909: dumping result to json 27885 1726882542.44973: done dumping result, returning 27885 1726882542.44978: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-3fa5-01be-00000000001d] 27885 1726882542.44986: sending task result for task 12673a56-9f93-3fa5-01be-00000000001d 27885 1726882542.45060: done sending task result for task 12673a56-9f93-3fa5-01be-00000000001d 27885 1726882542.45063: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27885 1726882542.45129: no more pending results, returning what we have 27885 1726882542.45133: results queue empty 27885 1726882542.45134: checking for any_errors_fatal 27885 1726882542.45140: done checking for any_errors_fatal 27885 1726882542.45141: checking for max_fail_percentage 27885 1726882542.45143: done checking for max_fail_percentage 27885 1726882542.45143: checking to see if all hosts have failed and the running result is not ok 27885 1726882542.45144: done checking to see if all hosts have failed 27885 1726882542.45145: getting the remaining hosts for this loop 27885 1726882542.45147: done getting the remaining hosts for this loop 27885 1726882542.45151: getting the next task for host managed_node2 27885 1726882542.45157: done getting next task for host managed_node2 27885 1726882542.45161: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 27885 1726882542.45165: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882542.45181: getting variables 27885 1726882542.45183: in VariableManager get_vars() 27885 1726882542.45230: Calling all_inventory to load vars for managed_node2 27885 1726882542.45233: Calling groups_inventory to load vars for managed_node2 27885 1726882542.45236: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882542.45248: Calling all_plugins_play to load vars for managed_node2 27885 1726882542.45251: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882542.45254: Calling groups_plugins_play to load vars for managed_node2 27885 1726882542.46998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882542.48626: done with get_vars() 27885 1726882542.48647: done getting variables 27885 1726882542.48715: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:35:42 -0400 (0:00:00.053) 0:00:15.129 ****** 27885 1726882542.48746: entering _queue_task() for managed_node2/fail 27885 1726882542.49225: worker is 1 (out of 1 available) 27885 1726882542.49236: exiting _queue_task() for managed_node2/fail 27885 1726882542.49247: done queuing things up, now waiting for results queue to drain 27885 1726882542.49248: waiting for pending results... 
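The same empty `network_state` short-circuits the next guard (main.yml:18) as well, which is why only `network_state != {}` shows up as the false_condition in the trace. A hedged sketch follows; the version comparison is only implied by the task name and is added here purely as an assumption.

```yaml
# Hedged sketch of roles/network/tasks/main.yml:18. Only the network_state check is
# proven by the trace; the version comparison is inferred from the task name and is
# therefore hypothetical, as is the message.
- name: >-
    Abort applying the network state configuration if the system version
    of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying the network state configuration requires version 8 or later  # hypothetical wording
  when:
    - network_state != {}                            # False here, so nothing further is evaluated
    - ansible_distribution_major_version | int < 8   # hypothetical; never reached in this run
```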
27885 1726882542.49414: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 27885 1726882542.49499: in run() - task 12673a56-9f93-3fa5-01be-00000000001e 27885 1726882542.49582: variable 'ansible_search_path' from source: unknown 27885 1726882542.49586: variable 'ansible_search_path' from source: unknown 27885 1726882542.49589: calling self._execute() 27885 1726882542.49665: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882542.49678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882542.49702: variable 'omit' from source: magic vars 27885 1726882542.50144: variable 'ansible_distribution_major_version' from source: facts 27885 1726882542.50165: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882542.50302: variable 'network_state' from source: role '' defaults 27885 1726882542.50318: Evaluated conditional (network_state != {}): False 27885 1726882542.50353: when evaluation is False, skipping this task 27885 1726882542.50357: _execute() done 27885 1726882542.50360: dumping result to json 27885 1726882542.50363: done dumping result, returning 27885 1726882542.50366: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-3fa5-01be-00000000001e] 27885 1726882542.50369: sending task result for task 12673a56-9f93-3fa5-01be-00000000001e 27885 1726882542.50585: done sending task result for task 12673a56-9f93-3fa5-01be-00000000001e 27885 1726882542.50589: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27885 1726882542.50640: no more pending results, returning what we have 27885 1726882542.50644: results queue empty 27885 1726882542.50645: checking for any_errors_fatal 27885 1726882542.50653: done checking for any_errors_fatal 27885 1726882542.50654: checking for max_fail_percentage 27885 1726882542.50655: done checking for max_fail_percentage 27885 1726882542.50656: checking to see if all hosts have failed and the running result is not ok 27885 1726882542.50657: done checking to see if all hosts have failed 27885 1726882542.50658: getting the remaining hosts for this loop 27885 1726882542.50659: done getting the remaining hosts for this loop 27885 1726882542.50663: getting the next task for host managed_node2 27885 1726882542.50670: done getting next task for host managed_node2 27885 1726882542.50788: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 27885 1726882542.50796: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882542.50811: getting variables 27885 1726882542.50812: in VariableManager get_vars() 27885 1726882542.50846: Calling all_inventory to load vars for managed_node2 27885 1726882542.50849: Calling groups_inventory to load vars for managed_node2 27885 1726882542.50851: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882542.50860: Calling all_plugins_play to load vars for managed_node2 27885 1726882542.50863: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882542.50866: Calling groups_plugins_play to load vars for managed_node2 27885 1726882542.52448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882542.54079: done with get_vars() 27885 1726882542.54099: done getting variables 27885 1726882542.54137: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:35:42 -0400 (0:00:00.054) 0:00:15.184 ****** 27885 1726882542.54162: entering _queue_task() for managed_node2/fail 27885 1726882542.54363: worker is 1 (out of 1 available) 27885 1726882542.54376: exiting _queue_task() for managed_node2/fail 27885 1726882542.54392: done queuing things up, now waiting for results queue to drain 27885 1726882542.54395: waiting for pending results... 
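For this task the `when` clause is fully visible in the trace below: a major-version test, a distribution test against `__network_rh_distros`, and a `selectattr` scan of both `network_connections` and `network_state` for `team`-type entries. The sketch is assembled from those printed expressions; only the failure message is a hypothetical placeholder.

```yaml
# Sketch of roles/network/tasks/main.yml:25, assembled from the three conditionals
# printed in the trace below; only the failure message is a hypothetical placeholder.
- name: >-
    Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later  # hypothetical wording
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    - >-
      network_connections | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", []) | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
```

On this host the first two conditions evaluate True, but neither the interface0/interface1 connections supplied by the play vars nor the empty `network_state` defines a team-type interface, so the third condition is False and the task is skipped.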
27885 1726882542.54560: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 27885 1726882542.54645: in run() - task 12673a56-9f93-3fa5-01be-00000000001f 27885 1726882542.54656: variable 'ansible_search_path' from source: unknown 27885 1726882542.54659: variable 'ansible_search_path' from source: unknown 27885 1726882542.54688: calling self._execute() 27885 1726882542.54755: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882542.54760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882542.54768: variable 'omit' from source: magic vars 27885 1726882542.55071: variable 'ansible_distribution_major_version' from source: facts 27885 1726882542.55298: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882542.55302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882542.58245: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882542.58611: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882542.58650: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882542.58687: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882542.58718: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882542.58798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882542.58833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882542.58864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882542.58910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882542.58928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882542.59025: variable 'ansible_distribution_major_version' from source: facts 27885 1726882542.59045: Evaluated conditional (ansible_distribution_major_version | int > 9): True 27885 1726882542.59247: variable 'ansible_distribution' from source: facts 27885 1726882542.59345: variable '__network_rh_distros' from source: role '' defaults 27885 1726882542.59398: Evaluated conditional (ansible_distribution in __network_rh_distros): True 27885 1726882542.60098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882542.60102: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882542.60105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882542.60107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882542.60110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882542.60112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882542.60114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882542.60313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882542.60359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882542.60378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882542.60426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882542.60524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882542.60553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882542.60898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882542.60901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882542.61251: variable 'network_connections' from source: task vars 27885 1726882542.61699: variable 'interface0' from source: play vars 27885 1726882542.61703: variable 'interface0' from source: play vars 27885 1726882542.61705: variable 'interface0' from source: play vars 27885 1726882542.61708: variable 'interface0' from source: play vars 27885 1726882542.61711: variable 'interface1' from source: play vars 27885 
1726882542.61730: variable 'interface1' from source: play vars 27885 1726882542.61908: variable 'interface1' from source: play vars 27885 1726882542.61969: variable 'interface1' from source: play vars 27885 1726882542.61989: variable 'network_state' from source: role '' defaults 27885 1726882542.62060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882542.62230: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882542.62434: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882542.62629: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882542.62665: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882542.62715: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882542.62739: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882542.62764: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882542.63199: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882542.63203: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 27885 1726882542.63206: when evaluation is False, skipping this task 27885 1726882542.63208: _execute() done 27885 1726882542.63210: dumping result to json 27885 1726882542.63212: done dumping result, returning 27885 1726882542.63215: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-3fa5-01be-00000000001f] 27885 1726882542.63222: sending task result for task 12673a56-9f93-3fa5-01be-00000000001f 27885 1726882542.63300: done sending task result for task 12673a56-9f93-3fa5-01be-00000000001f 27885 1726882542.63303: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 27885 1726882542.63364: no more pending results, returning what we have 27885 1726882542.63368: results queue empty 27885 1726882542.63369: checking for any_errors_fatal 27885 1726882542.63374: done checking for any_errors_fatal 27885 1726882542.63375: checking for max_fail_percentage 27885 1726882542.63376: done checking for max_fail_percentage 27885 1726882542.63377: checking to see if all hosts have 
failed and the running result is not ok 27885 1726882542.63377: done checking to see if all hosts have failed 27885 1726882542.63378: getting the remaining hosts for this loop 27885 1726882542.63380: done getting the remaining hosts for this loop 27885 1726882542.63383: getting the next task for host managed_node2 27885 1726882542.63391: done getting next task for host managed_node2 27885 1726882542.63398: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 27885 1726882542.63401: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882542.63415: getting variables 27885 1726882542.63416: in VariableManager get_vars() 27885 1726882542.63455: Calling all_inventory to load vars for managed_node2 27885 1726882542.63458: Calling groups_inventory to load vars for managed_node2 27885 1726882542.63460: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882542.63468: Calling all_plugins_play to load vars for managed_node2 27885 1726882542.63470: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882542.63473: Calling groups_plugins_play to load vars for managed_node2 27885 1726882542.71540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882542.73580: done with get_vars() 27885 1726882542.73611: done getting variables 27885 1726882542.73764: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:35:42 -0400 (0:00:00.196) 0:00:15.380 ****** 27885 1726882542.73831: entering _queue_task() for managed_node2/dnf 27885 1726882542.74191: worker is 1 (out of 1 available) 27885 1726882542.74343: exiting _queue_task() for managed_node2/dnf 27885 1726882542.74354: done queuing things up, now waiting for results queue to drain 27885 1726882542.74356: waiting for pending results... 
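The dnf-based check at main.yml:36 is only meant to run when wireless or team connections are defined. The action plugin and the two conditions are taken from the trace below; the module arguments are not visible in this log, so the package expression and options shown here are hypothetical.

```yaml
# Hedged sketch of roles/network/tasks/main.yml:36. The dnf action plugin and the two
# when-conditions come from the trace; the module arguments are hypothetical.
- name: >-
    Check if updates for network packages are available through the DNF
    package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"   # hypothetical: the real arguments are not printed in the log
    state: latest                    # hypothetical
  check_mode: true                   # hypothetical: a non-destructive availability check
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined
```

In this run the second condition is False, since none of the interface0/interface1 connections are wireless or team types, so the task is skipped before any module arguments would be rendered.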
27885 1726882542.74528: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 27885 1726882542.74659: in run() - task 12673a56-9f93-3fa5-01be-000000000020 27885 1726882542.74682: variable 'ansible_search_path' from source: unknown 27885 1726882542.74688: variable 'ansible_search_path' from source: unknown 27885 1726882542.74731: calling self._execute() 27885 1726882542.74827: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882542.74831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882542.74842: variable 'omit' from source: magic vars 27885 1726882542.75263: variable 'ansible_distribution_major_version' from source: facts 27885 1726882542.75274: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882542.75495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882542.79408: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882542.79502: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882542.79543: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882542.79584: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882542.79614: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882542.79714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882542.79742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882542.79770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882542.79820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882542.79833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882542.80135: variable 'ansible_distribution' from source: facts 27885 1726882542.80139: variable 'ansible_distribution_major_version' from source: facts 27885 1726882542.80156: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 27885 1726882542.80405: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882542.80783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882542.80813: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882542.80905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882542.81009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882542.81029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882542.81135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882542.81176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882542.81225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882542.81343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882542.81410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882542.81601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882542.81605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882542.81608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882542.81611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882542.81613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882542.81908: variable 'network_connections' from source: task vars 27885 1726882542.81983: variable 'interface0' from source: play vars 27885 1726882542.82250: variable 'interface0' from source: play vars 27885 1726882542.82260: variable 'interface0' from source: play vars 27885 1726882542.82333: variable 'interface0' from source: play vars 27885 1726882542.82345: variable 'interface1' from source: play vars 27885 
1726882542.82534: variable 'interface1' from source: play vars 27885 1726882542.82540: variable 'interface1' from source: play vars 27885 1726882542.82637: variable 'interface1' from source: play vars 27885 1726882542.82776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882542.83240: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882542.83278: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882542.83334: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882542.83362: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882542.83598: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882542.83602: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882542.83605: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882542.83608: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882542.83612: variable '__network_team_connections_defined' from source: role '' defaults 27885 1726882542.83853: variable 'network_connections' from source: task vars 27885 1726882542.83856: variable 'interface0' from source: play vars 27885 1726882542.83926: variable 'interface0' from source: play vars 27885 1726882542.83937: variable 'interface0' from source: play vars 27885 1726882542.84009: variable 'interface0' from source: play vars 27885 1726882542.84020: variable 'interface1' from source: play vars 27885 1726882542.84086: variable 'interface1' from source: play vars 27885 1726882542.84100: variable 'interface1' from source: play vars 27885 1726882542.84164: variable 'interface1' from source: play vars 27885 1726882542.84206: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27885 1726882542.84210: when evaluation is False, skipping this task 27885 1726882542.84213: _execute() done 27885 1726882542.84216: dumping result to json 27885 1726882542.84218: done dumping result, returning 27885 1726882542.84227: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-3fa5-01be-000000000020] 27885 1726882542.84234: sending task result for task 12673a56-9f93-3fa5-01be-000000000020 27885 1726882542.84337: done sending task result for task 12673a56-9f93-3fa5-01be-000000000020 27885 1726882542.84340: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27885 1726882542.84429: no more pending results, returning what we have 27885 
1726882542.84433: results queue empty 27885 1726882542.84434: checking for any_errors_fatal 27885 1726882542.84444: done checking for any_errors_fatal 27885 1726882542.84445: checking for max_fail_percentage 27885 1726882542.84447: done checking for max_fail_percentage 27885 1726882542.84447: checking to see if all hosts have failed and the running result is not ok 27885 1726882542.84448: done checking to see if all hosts have failed 27885 1726882542.84449: getting the remaining hosts for this loop 27885 1726882542.84451: done getting the remaining hosts for this loop 27885 1726882542.84455: getting the next task for host managed_node2 27885 1726882542.84461: done getting next task for host managed_node2 27885 1726882542.84465: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 27885 1726882542.84468: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882542.84485: getting variables 27885 1726882542.84487: in VariableManager get_vars() 27885 1726882542.84536: Calling all_inventory to load vars for managed_node2 27885 1726882542.84539: Calling groups_inventory to load vars for managed_node2 27885 1726882542.84542: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882542.84552: Calling all_plugins_play to load vars for managed_node2 27885 1726882542.84555: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882542.84558: Calling groups_plugins_play to load vars for managed_node2 27885 1726882542.87366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882542.89074: done with get_vars() 27885 1726882542.89105: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 27885 1726882542.89194: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:35:42 -0400 (0:00:00.153) 0:00:15.534 ****** 27885 1726882542.89226: entering _queue_task() for managed_node2/yum 27885 1726882542.89228: Creating lock for yum 27885 1726882542.89587: worker is 1 (out of 1 available) 27885 1726882542.89608: exiting _queue_task() for managed_node2/yum 27885 1726882542.89622: done queuing things up, now waiting for results queue to drain 27885 1726882542.89623: waiting for pending results... 
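The yum flavour of the same check (main.yml:48) is gated the other way, on hosts older than EL8; note in the trace how ansible-core first redirects `ansible.builtin.yum` to the dnf action plugin. Only the version guard is visible in the log, so the module arguments in this sketch are hypothetical.

```yaml
# Hedged sketch of roles/network/tasks/main.yml:48; the version guard comes from the
# trace, the module arguments are hypothetical placeholders.
- name: >-
    Check if updates for network packages are available through the YUM
    package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"   # hypothetical
    state: latest                    # hypothetical
  check_mode: true                   # hypothetical
  when:
    - ansible_distribution_major_version | int < 8   # False on this host, so the task is skipped
```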
27885 1726882542.90111: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 27885 1726882542.90117: in run() - task 12673a56-9f93-3fa5-01be-000000000021 27885 1726882542.90120: variable 'ansible_search_path' from source: unknown 27885 1726882542.90122: variable 'ansible_search_path' from source: unknown 27885 1726882542.90134: calling self._execute() 27885 1726882542.90228: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882542.90235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882542.90256: variable 'omit' from source: magic vars 27885 1726882542.90635: variable 'ansible_distribution_major_version' from source: facts 27885 1726882542.90648: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882542.90837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882542.93443: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882542.93505: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882542.93558: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882542.93596: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882542.93619: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882542.93703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882542.93731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882542.93767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882542.93807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882542.93820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882542.93916: variable 'ansible_distribution_major_version' from source: facts 27885 1726882542.93932: Evaluated conditional (ansible_distribution_major_version | int < 8): False 27885 1726882542.93936: when evaluation is False, skipping this task 27885 1726882542.93938: _execute() done 27885 1726882542.93941: dumping result to json 27885 1726882542.93943: done dumping result, returning 27885 1726882542.93963: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-3fa5-01be-000000000021] 27885 
1726882542.93969: sending task result for task 12673a56-9f93-3fa5-01be-000000000021 27885 1726882542.94079: done sending task result for task 12673a56-9f93-3fa5-01be-000000000021 27885 1726882542.94081: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 27885 1726882542.94142: no more pending results, returning what we have 27885 1726882542.94146: results queue empty 27885 1726882542.94147: checking for any_errors_fatal 27885 1726882542.94155: done checking for any_errors_fatal 27885 1726882542.94155: checking for max_fail_percentage 27885 1726882542.94157: done checking for max_fail_percentage 27885 1726882542.94158: checking to see if all hosts have failed and the running result is not ok 27885 1726882542.94159: done checking to see if all hosts have failed 27885 1726882542.94159: getting the remaining hosts for this loop 27885 1726882542.94161: done getting the remaining hosts for this loop 27885 1726882542.94164: getting the next task for host managed_node2 27885 1726882542.94171: done getting next task for host managed_node2 27885 1726882542.94175: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 27885 1726882542.94177: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882542.94195: getting variables 27885 1726882542.94197: in VariableManager get_vars() 27885 1726882542.94235: Calling all_inventory to load vars for managed_node2 27885 1726882542.94238: Calling groups_inventory to load vars for managed_node2 27885 1726882542.94240: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882542.94249: Calling all_plugins_play to load vars for managed_node2 27885 1726882542.94251: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882542.94254: Calling groups_plugins_play to load vars for managed_node2 27885 1726882542.95178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882542.96422: done with get_vars() 27885 1726882542.96443: done getting variables 27885 1726882542.96512: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:35:42 -0400 (0:00:00.073) 0:00:15.607 ****** 27885 1726882542.96543: entering _queue_task() for managed_node2/fail 27885 1726882542.96860: worker is 1 (out of 1 available) 27885 1726882542.96876: exiting _queue_task() for managed_node2/fail 27885 1726882542.96895: done queuing things up, now waiting for results queue to drain 27885 1726882542.96897: waiting for pending results... 
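The consent prompt at main.yml:60 is implemented with the same fail action and the same wireless/team guard that skipped the package checks. The sketch below takes the action and the condition from the trace and invents only the message text.

```yaml
# Hedged sketch of roles/network/tasks/main.yml:60. The fail action and the
# wireless/team guard are from the trace; the message wording is hypothetical.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-  # hypothetical wording
      Managing wireless or team interfaces requires restarting NetworkManager;
      confirm the restart explicitly before re-running the role.
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined
```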
27885 1726882542.97144: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 27885 1726882542.97302: in run() - task 12673a56-9f93-3fa5-01be-000000000022 27885 1726882542.97310: variable 'ansible_search_path' from source: unknown 27885 1726882542.97313: variable 'ansible_search_path' from source: unknown 27885 1726882542.97412: calling self._execute() 27885 1726882542.97417: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882542.97420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882542.97440: variable 'omit' from source: magic vars 27885 1726882542.97789: variable 'ansible_distribution_major_version' from source: facts 27885 1726882542.97801: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882542.97915: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882542.98101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882542.99645: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882542.99700: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882542.99728: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882542.99753: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882542.99772: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882542.99832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882542.99853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882542.99870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882542.99897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882542.99909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882542.99943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882542.99961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882542.99976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882543.00005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882543.00017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882543.00047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882543.00064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882543.00171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882543.00175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882543.00177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882543.00314: variable 'network_connections' from source: task vars 27885 1726882543.00317: variable 'interface0' from source: play vars 27885 1726882543.00498: variable 'interface0' from source: play vars 27885 1726882543.00502: variable 'interface0' from source: play vars 27885 1726882543.00504: variable 'interface0' from source: play vars 27885 1726882543.00507: variable 'interface1' from source: play vars 27885 1726882543.00529: variable 'interface1' from source: play vars 27885 1726882543.00532: variable 'interface1' from source: play vars 27885 1726882543.00582: variable 'interface1' from source: play vars 27885 1726882543.00651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882543.00816: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882543.00852: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882543.00881: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882543.00911: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882543.00952: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882543.00971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882543.00997: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882543.01029: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882543.01081: variable '__network_team_connections_defined' from source: role '' defaults 27885 1726882543.01304: variable 'network_connections' from source: task vars 27885 1726882543.01308: variable 'interface0' from source: play vars 27885 1726882543.01353: variable 'interface0' from source: play vars 27885 1726882543.01358: variable 'interface0' from source: play vars 27885 1726882543.01403: variable 'interface0' from source: play vars 27885 1726882543.01413: variable 'interface1' from source: play vars 27885 1726882543.01454: variable 'interface1' from source: play vars 27885 1726882543.01465: variable 'interface1' from source: play vars 27885 1726882543.01509: variable 'interface1' from source: play vars 27885 1726882543.01537: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27885 1726882543.01540: when evaluation is False, skipping this task 27885 1726882543.01543: _execute() done 27885 1726882543.01545: dumping result to json 27885 1726882543.01548: done dumping result, returning 27885 1726882543.01555: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-3fa5-01be-000000000022] 27885 1726882543.01560: sending task result for task 12673a56-9f93-3fa5-01be-000000000022 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27885 1726882543.01699: no more pending results, returning what we have 27885 1726882543.01703: results queue empty 27885 1726882543.01704: checking for any_errors_fatal 27885 1726882543.01710: done checking for any_errors_fatal 27885 1726882543.01711: checking for max_fail_percentage 27885 1726882543.01712: done checking for max_fail_percentage 27885 1726882543.01713: checking to see if all hosts have failed and the running result is not ok 27885 1726882543.01714: done checking to see if all hosts have failed 27885 1726882543.01714: getting the remaining hosts for this loop 27885 1726882543.01716: done getting the remaining hosts for this loop 27885 1726882543.01721: getting the next task for host managed_node2 27885 1726882543.01726: done getting next task for host managed_node2 27885 1726882543.01731: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 27885 1726882543.01734: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882543.01747: getting variables 27885 1726882543.01749: in VariableManager get_vars() 27885 1726882543.01788: Calling all_inventory to load vars for managed_node2 27885 1726882543.01801: Calling groups_inventory to load vars for managed_node2 27885 1726882543.01804: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882543.01810: done sending task result for task 12673a56-9f93-3fa5-01be-000000000022 27885 1726882543.01812: WORKER PROCESS EXITING 27885 1726882543.01820: Calling all_plugins_play to load vars for managed_node2 27885 1726882543.01822: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882543.01824: Calling groups_plugins_play to load vars for managed_node2 27885 1726882543.02627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882543.03612: done with get_vars() 27885 1726882543.03629: done getting variables 27885 1726882543.03674: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:35:43 -0400 (0:00:00.071) 0:00:15.679 ****** 27885 1726882543.03700: entering _queue_task() for managed_node2/package 27885 1726882543.03942: worker is 1 (out of 1 available) 27885 1726882543.03956: exiting _queue_task() for managed_node2/package 27885 1726882543.03969: done queuing things up, now waiting for results queue to drain 27885 1726882543.03970: waiting for pending results... 
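The step that does real work, main.yml:73, goes through the generic package action. The trace below shows `network_packages` coming from the role defaults while the NetworkManager provider package lists (`__network_packages_default_nm`, the gobject bindings, and wpa_supplicant when wireless or 802.1x connections require it) are resolved. The module name and the variable are confirmed by the log; the argument shape here is an assumption.

```yaml
# Hedged sketch of roles/network/tasks/main.yml:73. The package action plugin and the
# network_packages variable are confirmed by the trace; state and any other arguments
# are hypothetical.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"   # role default; the trace below resolves the NM provider lists that feed it
    state: present                   # hypothetical
```

The variable resolution for those provider defaults continues in the entries that follow.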
27885 1726882543.04150: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 27885 1726882543.04243: in run() - task 12673a56-9f93-3fa5-01be-000000000023 27885 1726882543.04255: variable 'ansible_search_path' from source: unknown 27885 1726882543.04258: variable 'ansible_search_path' from source: unknown 27885 1726882543.04288: calling self._execute() 27885 1726882543.04357: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882543.04361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882543.04369: variable 'omit' from source: magic vars 27885 1726882543.04653: variable 'ansible_distribution_major_version' from source: facts 27885 1726882543.04663: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882543.04802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882543.04998: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882543.05031: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882543.05055: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882543.05116: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882543.05194: variable 'network_packages' from source: role '' defaults 27885 1726882543.05265: variable '__network_provider_setup' from source: role '' defaults 27885 1726882543.05275: variable '__network_service_name_default_nm' from source: role '' defaults 27885 1726882543.05332: variable '__network_service_name_default_nm' from source: role '' defaults 27885 1726882543.05339: variable '__network_packages_default_nm' from source: role '' defaults 27885 1726882543.05383: variable '__network_packages_default_nm' from source: role '' defaults 27885 1726882543.05500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882543.06825: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882543.06866: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882543.06891: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882543.06920: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882543.06941: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882543.07010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882543.07036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882543.07053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882543.07079: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882543.07090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882543.07125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882543.07145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882543.07163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882543.07188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882543.07203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882543.07344: variable '__network_packages_default_gobject_packages' from source: role '' defaults 27885 1726882543.07419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882543.07435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882543.07451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882543.07479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882543.07490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882543.07551: variable 'ansible_python' from source: facts 27885 1726882543.07573: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 27885 1726882543.07631: variable '__network_wpa_supplicant_required' from source: role '' defaults 27885 1726882543.07688: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27885 1726882543.07767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882543.07788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 27885 1726882543.07807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882543.07831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882543.07841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882543.07872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882543.07892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882543.07914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882543.07938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882543.07948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882543.08044: variable 'network_connections' from source: task vars 27885 1726882543.08048: variable 'interface0' from source: play vars 27885 1726882543.08121: variable 'interface0' from source: play vars 27885 1726882543.08129: variable 'interface0' from source: play vars 27885 1726882543.08197: variable 'interface0' from source: play vars 27885 1726882543.08209: variable 'interface1' from source: play vars 27885 1726882543.08278: variable 'interface1' from source: play vars 27885 1726882543.08287: variable 'interface1' from source: play vars 27885 1726882543.08358: variable 'interface1' from source: play vars 27885 1726882543.08413: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882543.08432: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882543.08456: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882543.08477: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882543.08516: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882543.08690: variable 'network_connections' from source: task vars 27885 1726882543.08697: variable 
'interface0' from source: play vars 27885 1726882543.08765: variable 'interface0' from source: play vars 27885 1726882543.08775: variable 'interface0' from source: play vars 27885 1726882543.08843: variable 'interface0' from source: play vars 27885 1726882543.08853: variable 'interface1' from source: play vars 27885 1726882543.08926: variable 'interface1' from source: play vars 27885 1726882543.08933: variable 'interface1' from source: play vars 27885 1726882543.09003: variable 'interface1' from source: play vars 27885 1726882543.09042: variable '__network_packages_default_wireless' from source: role '' defaults 27885 1726882543.09096: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882543.09286: variable 'network_connections' from source: task vars 27885 1726882543.09289: variable 'interface0' from source: play vars 27885 1726882543.09340: variable 'interface0' from source: play vars 27885 1726882543.09345: variable 'interface0' from source: play vars 27885 1726882543.09388: variable 'interface0' from source: play vars 27885 1726882543.09401: variable 'interface1' from source: play vars 27885 1726882543.09447: variable 'interface1' from source: play vars 27885 1726882543.09452: variable 'interface1' from source: play vars 27885 1726882543.09498: variable 'interface1' from source: play vars 27885 1726882543.09519: variable '__network_packages_default_team' from source: role '' defaults 27885 1726882543.09572: variable '__network_team_connections_defined' from source: role '' defaults 27885 1726882543.09765: variable 'network_connections' from source: task vars 27885 1726882543.09768: variable 'interface0' from source: play vars 27885 1726882543.09817: variable 'interface0' from source: play vars 27885 1726882543.09822: variable 'interface0' from source: play vars 27885 1726882543.09868: variable 'interface0' from source: play vars 27885 1726882543.09876: variable 'interface1' from source: play vars 27885 1726882543.09924: variable 'interface1' from source: play vars 27885 1726882543.09929: variable 'interface1' from source: play vars 27885 1726882543.09975: variable 'interface1' from source: play vars 27885 1726882543.10025: variable '__network_service_name_default_initscripts' from source: role '' defaults 27885 1726882543.10066: variable '__network_service_name_default_initscripts' from source: role '' defaults 27885 1726882543.10071: variable '__network_packages_default_initscripts' from source: role '' defaults 27885 1726882543.10119: variable '__network_packages_default_initscripts' from source: role '' defaults 27885 1726882543.10253: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 27885 1726882543.10558: variable 'network_connections' from source: task vars 27885 1726882543.10561: variable 'interface0' from source: play vars 27885 1726882543.10607: variable 'interface0' from source: play vars 27885 1726882543.10612: variable 'interface0' from source: play vars 27885 1726882543.10656: variable 'interface0' from source: play vars 27885 1726882543.10664: variable 'interface1' from source: play vars 27885 1726882543.10708: variable 'interface1' from source: play vars 27885 1726882543.10714: variable 'interface1' from source: play vars 27885 1726882543.10757: variable 'interface1' from source: play vars 27885 1726882543.10766: variable 'ansible_distribution' from source: facts 27885 1726882543.10769: variable '__network_rh_distros' from source: role '' defaults 27885 1726882543.10774: variable 
'ansible_distribution_major_version' from source: facts 27885 1726882543.10792: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 27885 1726882543.10901: variable 'ansible_distribution' from source: facts 27885 1726882543.10904: variable '__network_rh_distros' from source: role '' defaults 27885 1726882543.10909: variable 'ansible_distribution_major_version' from source: facts 27885 1726882543.10920: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 27885 1726882543.11027: variable 'ansible_distribution' from source: facts 27885 1726882543.11031: variable '__network_rh_distros' from source: role '' defaults 27885 1726882543.11034: variable 'ansible_distribution_major_version' from source: facts 27885 1726882543.11061: variable 'network_provider' from source: set_fact 27885 1726882543.11074: variable 'ansible_facts' from source: unknown 27885 1726882543.11512: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 27885 1726882543.11516: when evaluation is False, skipping this task 27885 1726882543.11519: _execute() done 27885 1726882543.11521: dumping result to json 27885 1726882543.11524: done dumping result, returning 27885 1726882543.11530: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-3fa5-01be-000000000023] 27885 1726882543.11535: sending task result for task 12673a56-9f93-3fa5-01be-000000000023 27885 1726882543.11625: done sending task result for task 12673a56-9f93-3fa5-01be-000000000023 27885 1726882543.11627: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 27885 1726882543.11671: no more pending results, returning what we have 27885 1726882543.11675: results queue empty 27885 1726882543.11680: checking for any_errors_fatal 27885 1726882543.11687: done checking for any_errors_fatal 27885 1726882543.11688: checking for max_fail_percentage 27885 1726882543.11690: done checking for max_fail_percentage 27885 1726882543.11690: checking to see if all hosts have failed and the running result is not ok 27885 1726882543.11691: done checking to see if all hosts have failed 27885 1726882543.11692: getting the remaining hosts for this loop 27885 1726882543.11695: done getting the remaining hosts for this loop 27885 1726882543.11699: getting the next task for host managed_node2 27885 1726882543.11705: done getting next task for host managed_node2 27885 1726882543.11708: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 27885 1726882543.11711: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882543.11725: getting variables 27885 1726882543.11727: in VariableManager get_vars() 27885 1726882543.11766: Calling all_inventory to load vars for managed_node2 27885 1726882543.11769: Calling groups_inventory to load vars for managed_node2 27885 1726882543.11771: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882543.11780: Calling all_plugins_play to load vars for managed_node2 27885 1726882543.11782: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882543.11785: Calling groups_plugins_play to load vars for managed_node2 27885 1726882543.12584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882543.13450: done with get_vars() 27885 1726882543.13466: done getting variables 27885 1726882543.13511: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:35:43 -0400 (0:00:00.098) 0:00:15.777 ****** 27885 1726882543.13536: entering _queue_task() for managed_node2/package 27885 1726882543.13760: worker is 1 (out of 1 available) 27885 1726882543.13773: exiting _queue_task() for managed_node2/package 27885 1726882543.13786: done queuing things up, now waiting for results queue to drain 27885 1726882543.13787: waiting for pending results... 
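The "Install packages" task above is skipped because its conditional, not network_packages is subset(ansible_facts.packages.keys()), evaluates to False: every package listed in network_packages already appears in the package facts gathered earlier in the run. Below is a minimal sketch of a task with that shape, reconstructed only from the task name, the 'package' action plugin, and the false_condition reported in the skip result; it is not the verbatim role source from roles/network/tasks/main.yml.

    # Hedged reconstruction from the log entries above (task name, 'package'
    # action, reported false_condition); not the actual role task text.
    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when:
        # Skip when everything in network_packages is already installed
        # according to the package facts gathered earlier in the play.
        - not network_packages is subset(ansible_facts.packages.keys())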
27885 1726882543.13967: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 27885 1726882543.14059: in run() - task 12673a56-9f93-3fa5-01be-000000000024 27885 1726882543.14071: variable 'ansible_search_path' from source: unknown 27885 1726882543.14075: variable 'ansible_search_path' from source: unknown 27885 1726882543.14106: calling self._execute() 27885 1726882543.14172: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882543.14177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882543.14186: variable 'omit' from source: magic vars 27885 1726882543.14456: variable 'ansible_distribution_major_version' from source: facts 27885 1726882543.14466: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882543.14552: variable 'network_state' from source: role '' defaults 27885 1726882543.14556: Evaluated conditional (network_state != {}): False 27885 1726882543.14558: when evaluation is False, skipping this task 27885 1726882543.14563: _execute() done 27885 1726882543.14567: dumping result to json 27885 1726882543.14570: done dumping result, returning 27885 1726882543.14577: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-3fa5-01be-000000000024] 27885 1726882543.14588: sending task result for task 12673a56-9f93-3fa5-01be-000000000024 27885 1726882543.14674: done sending task result for task 12673a56-9f93-3fa5-01be-000000000024 27885 1726882543.14677: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27885 1726882543.14725: no more pending results, returning what we have 27885 1726882543.14729: results queue empty 27885 1726882543.14730: checking for any_errors_fatal 27885 1726882543.14739: done checking for any_errors_fatal 27885 1726882543.14739: checking for max_fail_percentage 27885 1726882543.14741: done checking for max_fail_percentage 27885 1726882543.14742: checking to see if all hosts have failed and the running result is not ok 27885 1726882543.14743: done checking to see if all hosts have failed 27885 1726882543.14743: getting the remaining hosts for this loop 27885 1726882543.14745: done getting the remaining hosts for this loop 27885 1726882543.14748: getting the next task for host managed_node2 27885 1726882543.14754: done getting next task for host managed_node2 27885 1726882543.14757: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 27885 1726882543.14760: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882543.14774: getting variables 27885 1726882543.14775: in VariableManager get_vars() 27885 1726882543.14817: Calling all_inventory to load vars for managed_node2 27885 1726882543.14820: Calling groups_inventory to load vars for managed_node2 27885 1726882543.14822: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882543.14830: Calling all_plugins_play to load vars for managed_node2 27885 1726882543.14832: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882543.14835: Calling groups_plugins_play to load vars for managed_node2 27885 1726882543.15685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882543.16549: done with get_vars() 27885 1726882543.16563: done getting variables 27885 1726882543.16608: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:35:43 -0400 (0:00:00.030) 0:00:15.808 ****** 27885 1726882543.16630: entering _queue_task() for managed_node2/package 27885 1726882543.16842: worker is 1 (out of 1 available) 27885 1726882543.16854: exiting _queue_task() for managed_node2/package 27885 1726882543.16867: done queuing things up, now waiting for results queue to drain 27885 1726882543.16869: waiting for pending results... 
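The "Install NetworkManager and nmstate when using network_state variable" task at main.yml:85 is skipped because network_state is still the role default of an empty dict, so network_state != {} is False. A sketch of a task gated this way follows; the package list is an assumption, since the log only exposes the 'package' action plugin and the condition.

    # Sketch based on the task name and false_condition shown in the log;
    # the package names are assumed, not taken from the role source.
    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager
          - nmstate
        state: present
      when: network_state != {}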
27885 1726882543.17036: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 27885 1726882543.17124: in run() - task 12673a56-9f93-3fa5-01be-000000000025 27885 1726882543.17136: variable 'ansible_search_path' from source: unknown 27885 1726882543.17140: variable 'ansible_search_path' from source: unknown 27885 1726882543.17166: calling self._execute() 27885 1726882543.17234: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882543.17238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882543.17247: variable 'omit' from source: magic vars 27885 1726882543.17515: variable 'ansible_distribution_major_version' from source: facts 27885 1726882543.17525: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882543.17607: variable 'network_state' from source: role '' defaults 27885 1726882543.17615: Evaluated conditional (network_state != {}): False 27885 1726882543.17619: when evaluation is False, skipping this task 27885 1726882543.17621: _execute() done 27885 1726882543.17624: dumping result to json 27885 1726882543.17629: done dumping result, returning 27885 1726882543.17637: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-3fa5-01be-000000000025] 27885 1726882543.17640: sending task result for task 12673a56-9f93-3fa5-01be-000000000025 27885 1726882543.17729: done sending task result for task 12673a56-9f93-3fa5-01be-000000000025 27885 1726882543.17732: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27885 1726882543.17795: no more pending results, returning what we have 27885 1726882543.17799: results queue empty 27885 1726882543.17800: checking for any_errors_fatal 27885 1726882543.17807: done checking for any_errors_fatal 27885 1726882543.17808: checking for max_fail_percentage 27885 1726882543.17809: done checking for max_fail_percentage 27885 1726882543.17810: checking to see if all hosts have failed and the running result is not ok 27885 1726882543.17811: done checking to see if all hosts have failed 27885 1726882543.17811: getting the remaining hosts for this loop 27885 1726882543.17813: done getting the remaining hosts for this loop 27885 1726882543.17815: getting the next task for host managed_node2 27885 1726882543.17821: done getting next task for host managed_node2 27885 1726882543.17824: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 27885 1726882543.17826: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882543.17839: getting variables 27885 1726882543.17841: in VariableManager get_vars() 27885 1726882543.17872: Calling all_inventory to load vars for managed_node2 27885 1726882543.17874: Calling groups_inventory to load vars for managed_node2 27885 1726882543.17876: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882543.17884: Calling all_plugins_play to load vars for managed_node2 27885 1726882543.17886: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882543.17889: Calling groups_plugins_play to load vars for managed_node2 27885 1726882543.18624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882543.19499: done with get_vars() 27885 1726882543.19518: done getting variables 27885 1726882543.19588: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:35:43 -0400 (0:00:00.029) 0:00:15.838 ****** 27885 1726882543.19615: entering _queue_task() for managed_node2/service 27885 1726882543.19618: Creating lock for service 27885 1726882543.19858: worker is 1 (out of 1 available) 27885 1726882543.19871: exiting _queue_task() for managed_node2/service 27885 1726882543.19885: done queuing things up, now waiting for results queue to drain 27885 1726882543.19886: waiting for pending results... 
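The "Install python3-libnmstate when using network_state variable" task at main.yml:96 was guarded by the same network_state != {} condition and skipped for the same reason. Both install tasks would run if the play supplied a non-empty network_state; the fragment below is purely illustrative, and the interface name and type are assumptions rather than values taken from this run.

    # Illustrative only: any non-empty network_state flips the two
    # conditionals above to True. eth0/ethernet are assumed values.
    vars:
      network_state:
        interfaces:
          - name: eth0
            type: ethernet
            state: up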
27885 1726882543.20059: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 27885 1726882543.20149: in run() - task 12673a56-9f93-3fa5-01be-000000000026 27885 1726882543.20162: variable 'ansible_search_path' from source: unknown 27885 1726882543.20166: variable 'ansible_search_path' from source: unknown 27885 1726882543.20197: calling self._execute() 27885 1726882543.20261: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882543.20266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882543.20275: variable 'omit' from source: magic vars 27885 1726882543.20549: variable 'ansible_distribution_major_version' from source: facts 27885 1726882543.20560: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882543.20638: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882543.20775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882543.22434: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882543.22476: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882543.22512: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882543.22539: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882543.22559: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882543.22617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882543.22640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882543.22657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882543.22683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882543.22696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882543.22728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882543.22747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882543.22764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 27885 1726882543.22792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882543.22803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882543.22831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882543.22852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882543.22868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882543.22896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882543.22906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882543.23012: variable 'network_connections' from source: task vars 27885 1726882543.23022: variable 'interface0' from source: play vars 27885 1726882543.23072: variable 'interface0' from source: play vars 27885 1726882543.23080: variable 'interface0' from source: play vars 27885 1726882543.23124: variable 'interface0' from source: play vars 27885 1726882543.23134: variable 'interface1' from source: play vars 27885 1726882543.23177: variable 'interface1' from source: play vars 27885 1726882543.23180: variable 'interface1' from source: play vars 27885 1726882543.23226: variable 'interface1' from source: play vars 27885 1726882543.23274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882543.23381: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882543.23411: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882543.23442: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882543.23462: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882543.23495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882543.23511: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882543.23531: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882543.23548: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882543.23597: variable '__network_team_connections_defined' from source: role '' defaults 27885 1726882543.23737: variable 'network_connections' from source: task vars 27885 1726882543.23740: variable 'interface0' from source: play vars 27885 1726882543.23780: variable 'interface0' from source: play vars 27885 1726882543.23785: variable 'interface0' from source: play vars 27885 1726882543.23830: variable 'interface0' from source: play vars 27885 1726882543.23840: variable 'interface1' from source: play vars 27885 1726882543.23881: variable 'interface1' from source: play vars 27885 1726882543.23897: variable 'interface1' from source: play vars 27885 1726882543.23930: variable 'interface1' from source: play vars 27885 1726882543.23958: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27885 1726882543.23962: when evaluation is False, skipping this task 27885 1726882543.23965: _execute() done 27885 1726882543.23967: dumping result to json 27885 1726882543.23969: done dumping result, returning 27885 1726882543.23976: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-3fa5-01be-000000000026] 27885 1726882543.23981: sending task result for task 12673a56-9f93-3fa5-01be-000000000026 27885 1726882543.24069: done sending task result for task 12673a56-9f93-3fa5-01be-000000000026 27885 1726882543.24072: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27885 1726882543.24119: no more pending results, returning what we have 27885 1726882543.24122: results queue empty 27885 1726882543.24123: checking for any_errors_fatal 27885 1726882543.24130: done checking for any_errors_fatal 27885 1726882543.24131: checking for max_fail_percentage 27885 1726882543.24132: done checking for max_fail_percentage 27885 1726882543.24133: checking to see if all hosts have failed and the running result is not ok 27885 1726882543.24134: done checking to see if all hosts have failed 27885 1726882543.24134: getting the remaining hosts for this loop 27885 1726882543.24136: done getting the remaining hosts for this loop 27885 1726882543.24140: getting the next task for host managed_node2 27885 1726882543.24146: done getting next task for host managed_node2 27885 1726882543.24150: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 27885 1726882543.24152: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882543.24167: getting variables 27885 1726882543.24168: in VariableManager get_vars() 27885 1726882543.24218: Calling all_inventory to load vars for managed_node2 27885 1726882543.24221: Calling groups_inventory to load vars for managed_node2 27885 1726882543.24223: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882543.24232: Calling all_plugins_play to load vars for managed_node2 27885 1726882543.24234: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882543.24237: Calling groups_plugins_play to load vars for managed_node2 27885 1726882543.25145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882543.26009: done with get_vars() 27885 1726882543.26027: done getting variables 27885 1726882543.26069: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:35:43 -0400 (0:00:00.064) 0:00:15.903 ****** 27885 1726882543.26095: entering _queue_task() for managed_node2/service 27885 1726882543.26317: worker is 1 (out of 1 available) 27885 1726882543.26330: exiting _queue_task() for managed_node2/service 27885 1726882543.26343: done queuing things up, now waiting for results queue to drain 27885 1726882543.26345: waiting for pending results... 27885 1726882543.26508: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 27885 1726882543.26600: in run() - task 12673a56-9f93-3fa5-01be-000000000027 27885 1726882543.26611: variable 'ansible_search_path' from source: unknown 27885 1726882543.26614: variable 'ansible_search_path' from source: unknown 27885 1726882543.26642: calling self._execute() 27885 1726882543.26712: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882543.26717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882543.26725: variable 'omit' from source: magic vars 27885 1726882543.26995: variable 'ansible_distribution_major_version' from source: facts 27885 1726882543.27003: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882543.27109: variable 'network_provider' from source: set_fact 27885 1726882543.27115: variable 'network_state' from source: role '' defaults 27885 1726882543.27121: Evaluated conditional (network_provider == "nm" or network_state != {}): True 27885 1726882543.27130: variable 'omit' from source: magic vars 27885 1726882543.27160: variable 'omit' from source: magic vars 27885 1726882543.27181: variable 'network_service_name' from source: role '' defaults 27885 1726882543.27237: variable 'network_service_name' from source: role '' defaults 27885 1726882543.27305: variable '__network_provider_setup' from source: role '' defaults 27885 1726882543.27309: variable '__network_service_name_default_nm' from source: role '' defaults 27885 1726882543.27356: variable '__network_service_name_default_nm' from source: role '' defaults 27885 1726882543.27364: variable '__network_packages_default_nm' from source: role '' defaults 
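In the entries above, "Restart NetworkManager due to wireless or team interfaces" is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined holds for the two connections in network_connections, while "Enable and start NetworkManager" at main.yml:122 passes its conditional, network_provider == "nm" or network_state != {}, and proceeds through variable resolution before connecting to the host further below. A sketch of the shape of that service task, reconstructed from the 'service' action plugin, the evaluated condition, and the network_service_name lookups in the log; the exact module arguments are assumptions.

    # Reconstruction from the log ('service' action, evaluated condition,
    # network_service_name variable); argument values are assumptions.
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"  # resolved from __network_service_name_default_nm
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}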
27885 1726882543.27410: variable '__network_packages_default_nm' from source: role '' defaults 27885 1726882543.27552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882543.28939: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882543.28995: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882543.29019: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882543.29044: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882543.29063: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882543.29122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882543.29143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882543.29249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882543.29252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882543.29255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882543.29257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882543.29499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882543.29502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882543.29504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882543.29506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882543.29669: variable '__network_packages_default_gobject_packages' from source: role '' defaults 27885 1726882543.29800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882543.29831: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882543.29873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882543.29924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882543.29956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882543.30127: variable 'ansible_python' from source: facts 27885 1726882543.30186: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 27885 1726882543.30284: variable '__network_wpa_supplicant_required' from source: role '' defaults 27885 1726882543.30398: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27885 1726882543.30526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882543.30558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882543.30620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882543.30678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882543.30682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882543.30802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882543.30813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882543.30815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882543.30819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882543.30821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882543.30977: variable 'network_connections' from 
source: task vars 27885 1726882543.30980: variable 'interface0' from source: play vars 27885 1726882543.31042: variable 'interface0' from source: play vars 27885 1726882543.31063: variable 'interface0' from source: play vars 27885 1726882543.31200: variable 'interface0' from source: play vars 27885 1726882543.31203: variable 'interface1' from source: play vars 27885 1726882543.31253: variable 'interface1' from source: play vars 27885 1726882543.31265: variable 'interface1' from source: play vars 27885 1726882543.31354: variable 'interface1' from source: play vars 27885 1726882543.31484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882543.31722: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882543.31770: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882543.31864: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882543.31911: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882543.32125: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882543.32195: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882543.32232: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882543.32582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882543.32585: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882543.32975: variable 'network_connections' from source: task vars 27885 1726882543.32981: variable 'interface0' from source: play vars 27885 1726882543.33071: variable 'interface0' from source: play vars 27885 1726882543.33082: variable 'interface0' from source: play vars 27885 1726882543.33174: variable 'interface0' from source: play vars 27885 1726882543.33208: variable 'interface1' from source: play vars 27885 1726882543.33292: variable 'interface1' from source: play vars 27885 1726882543.33499: variable 'interface1' from source: play vars 27885 1726882543.33502: variable 'interface1' from source: play vars 27885 1726882543.33505: variable '__network_packages_default_wireless' from source: role '' defaults 27885 1726882543.33530: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882543.33883: variable 'network_connections' from source: task vars 27885 1726882543.33886: variable 'interface0' from source: play vars 27885 1726882543.34035: variable 'interface0' from source: play vars 27885 1726882543.34042: variable 'interface0' from source: play vars 27885 1726882543.34138: variable 'interface0' from source: play vars 27885 1726882543.34172: variable 'interface1' from source: play vars 27885 1726882543.34289: variable 'interface1' from source: play vars 27885 1726882543.34304: variable 'interface1' from source: 
play vars 27885 1726882543.34386: variable 'interface1' from source: play vars 27885 1726882543.34415: variable '__network_packages_default_team' from source: role '' defaults 27885 1726882543.34470: variable '__network_team_connections_defined' from source: role '' defaults 27885 1726882543.34656: variable 'network_connections' from source: task vars 27885 1726882543.34660: variable 'interface0' from source: play vars 27885 1726882543.34712: variable 'interface0' from source: play vars 27885 1726882543.34717: variable 'interface0' from source: play vars 27885 1726882543.34767: variable 'interface0' from source: play vars 27885 1726882543.34776: variable 'interface1' from source: play vars 27885 1726882543.34828: variable 'interface1' from source: play vars 27885 1726882543.34833: variable 'interface1' from source: play vars 27885 1726882543.34885: variable 'interface1' from source: play vars 27885 1726882543.34930: variable '__network_service_name_default_initscripts' from source: role '' defaults 27885 1726882543.34972: variable '__network_service_name_default_initscripts' from source: role '' defaults 27885 1726882543.34978: variable '__network_packages_default_initscripts' from source: role '' defaults 27885 1726882543.35024: variable '__network_packages_default_initscripts' from source: role '' defaults 27885 1726882543.35172: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 27885 1726882543.35473: variable 'network_connections' from source: task vars 27885 1726882543.35477: variable 'interface0' from source: play vars 27885 1726882543.35523: variable 'interface0' from source: play vars 27885 1726882543.35529: variable 'interface0' from source: play vars 27885 1726882543.35568: variable 'interface0' from source: play vars 27885 1726882543.35577: variable 'interface1' from source: play vars 27885 1726882543.35624: variable 'interface1' from source: play vars 27885 1726882543.35629: variable 'interface1' from source: play vars 27885 1726882543.35669: variable 'interface1' from source: play vars 27885 1726882543.35679: variable 'ansible_distribution' from source: facts 27885 1726882543.35681: variable '__network_rh_distros' from source: role '' defaults 27885 1726882543.35687: variable 'ansible_distribution_major_version' from source: facts 27885 1726882543.35708: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 27885 1726882543.35820: variable 'ansible_distribution' from source: facts 27885 1726882543.35823: variable '__network_rh_distros' from source: role '' defaults 27885 1726882543.35826: variable 'ansible_distribution_major_version' from source: facts 27885 1726882543.35838: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 27885 1726882543.35948: variable 'ansible_distribution' from source: facts 27885 1726882543.35953: variable '__network_rh_distros' from source: role '' defaults 27885 1726882543.35955: variable 'ansible_distribution_major_version' from source: facts 27885 1726882543.36030: variable 'network_provider' from source: set_fact 27885 1726882543.36060: variable 'omit' from source: magic vars 27885 1726882543.36100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882543.36214: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882543.36217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
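With the task variables resolved, the executor turns to transport: it selects the ssh connection plugin and the sh shell plugin here, and the entries that follow set the per-task connection vars (ansible_connection=ssh, ansible_timeout=10, ansible_shell_type=sh, pipelining disabled, ZIP_DEFLATED module compression), with ansible_host and ansible_ssh_extra_args drawn from the host vars for managed_node2. A hypothetical inventory shape consistent with those host vars is sketched below; the real /tmp/network-Kc3/inventory.yml is not reproduced in this log, the address is simply the one the ssh debug output connects to, and the extra-args value is a placeholder assumption.

    # Hypothetical sketch only, not the actual inventory file contents.
    all:
      hosts:
        managed_node2:
          ansible_host: 10.31.14.69                              # address seen in the ssh debug output
          ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"  # placeholder assumption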
27885 1726882543.36219: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882543.36225: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882543.36228: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882543.36230: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882543.36232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882543.36337: Set connection var ansible_pipelining to False 27885 1726882543.36348: Set connection var ansible_connection to ssh 27885 1726882543.36354: Set connection var ansible_timeout to 10 27885 1726882543.36356: Set connection var ansible_shell_type to sh 27885 1726882543.36359: Set connection var ansible_shell_executable to /bin/sh 27885 1726882543.36360: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882543.36461: variable 'ansible_shell_executable' from source: unknown 27885 1726882543.36464: variable 'ansible_connection' from source: unknown 27885 1726882543.36466: variable 'ansible_module_compression' from source: unknown 27885 1726882543.36468: variable 'ansible_shell_type' from source: unknown 27885 1726882543.36469: variable 'ansible_shell_executable' from source: unknown 27885 1726882543.36471: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882543.36473: variable 'ansible_pipelining' from source: unknown 27885 1726882543.36500: variable 'ansible_timeout' from source: unknown 27885 1726882543.36505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882543.36588: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882543.36676: variable 'omit' from source: magic vars 27885 1726882543.36679: starting attempt loop 27885 1726882543.36682: running the handler 27885 1726882543.37000: variable 'ansible_facts' from source: unknown 27885 1726882543.37528: _low_level_execute_command(): starting 27885 1726882543.37535: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882543.38296: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882543.38312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882543.38324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882543.38425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882543.40117: stdout chunk (state=3): >>>/root <<< 27885 1726882543.40249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882543.40252: stdout chunk (state=3): >>><<< 27885 1726882543.40254: stderr chunk (state=3): >>><<< 27885 1726882543.40361: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882543.40366: _low_level_execute_command(): starting 27885 1726882543.40370: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882543.4027245-28637-246277185026214 `" && echo ansible-tmp-1726882543.4027245-28637-246277185026214="` echo /root/.ansible/tmp/ansible-tmp-1726882543.4027245-28637-246277185026214 `" ) && sleep 0' 27885 1726882543.41006: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882543.41025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882543.41050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882543.41110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882543.41177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882543.41210: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882543.41232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882543.41331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882543.43600: stdout chunk (state=3): >>>ansible-tmp-1726882543.4027245-28637-246277185026214=/root/.ansible/tmp/ansible-tmp-1726882543.4027245-28637-246277185026214 <<< 27885 1726882543.43604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882543.43606: stdout chunk (state=3): >>><<< 27885 1726882543.43608: stderr chunk (state=3): >>><<< 27885 1726882543.43611: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882543.4027245-28637-246277185026214=/root/.ansible/tmp/ansible-tmp-1726882543.4027245-28637-246277185026214 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882543.43620: variable 'ansible_module_compression' from source: unknown 27885 1726882543.43623: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 27885 1726882543.43626: ANSIBALLZ: Acquiring lock 27885 1726882543.43628: ANSIBALLZ: Lock acquired: 140560087758944 27885 1726882543.43629: ANSIBALLZ: Creating module 27885 1726882543.86233: ANSIBALLZ: Writing module into payload 27885 1726882543.86420: ANSIBALLZ: Writing module 27885 1726882543.86446: ANSIBALLZ: Renaming module 27885 1726882543.86453: ANSIBALLZ: Done creating module 27885 1726882543.86475: variable 'ansible_facts' from source: unknown 27885 1726882543.86683: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882543.4027245-28637-246277185026214/AnsiballZ_systemd.py 27885 1726882543.86834: Sending initial data 27885 1726882543.86837: Sent initial data (156 bytes) 27885 1726882543.87591: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882543.87617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882543.87637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882543.87645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882543.87732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882543.89384: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 27885 1726882543.89412: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882543.89467: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882543.89553: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmphwax4x23 /root/.ansible/tmp/ansible-tmp-1726882543.4027245-28637-246277185026214/AnsiballZ_systemd.py <<< 27885 1726882543.89556: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882543.4027245-28637-246277185026214/AnsiballZ_systemd.py" <<< 27885 1726882543.89638: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmphwax4x23" to remote "/root/.ansible/tmp/ansible-tmp-1726882543.4027245-28637-246277185026214/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882543.4027245-28637-246277185026214/AnsiballZ_systemd.py" <<< 27885 1726882543.91152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882543.91196: stderr chunk (state=3): >>><<< 27885 1726882543.91199: stdout chunk (state=3): >>><<< 27885 1726882543.91257: done transferring module to remote 27885 1726882543.91260: _low_level_execute_command(): starting 27885 1726882543.91268: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882543.4027245-28637-246277185026214/ /root/.ansible/tmp/ansible-tmp-1726882543.4027245-28637-246277185026214/AnsiballZ_systemd.py && sleep 0' 27885 1726882543.91930: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882543.91958: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882543.92011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882543.92072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
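The trace above is the standard module-staging sequence for a non-pipelined SSH connection: the connection variables resolved earlier (ansible_connection=ssh, ansible_shell_type=sh, ansible_shell_executable=/bin/sh, ansible_timeout=10, ansible_pipelining=False, ansible_module_compression=ZIP_DEFLATED) are applied, a remote temp directory is created under ~/.ansible/tmp, the AnsiballZ_systemd.py payload is uploaded over the multiplexed SSH session via sftp, and the upload is marked executable with chmod. A minimal host_vars sketch that would pin the same connection settings is shown below; the file path and the idea of setting these in inventory are illustrative assumptions, since in this run most of the values come from built-in defaults rather than from host vars.

# hypothetical host_vars/managed_node2.yml, mirroring the connection
# settings the trace shows being applied; in this run most of them come
# from defaults, not from the inventory
ansible_host: 10.31.14.69            # target address seen in the ssh debug output
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_timeout: 10
ansible_pipelining: false            # with pipelining off, every module run stages a temp dir
ansible_module_compression: ZIP_DEFLATED

With ansible_pipelining left at false, each module invocation repeats the mkdir, sftp put, chmod and python cycle visible here; enabling pipelining would skip the temp directory and file transfer for modules that support it.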
27885 1726882543.92083: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882543.92166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882543.92218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882543.92283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882543.94018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882543.94043: stderr chunk (state=3): >>><<< 27885 1726882543.94046: stdout chunk (state=3): >>><<< 27885 1726882543.94059: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882543.94062: _low_level_execute_command(): starting 27885 1726882543.94068: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882543.4027245-28637-246277185026214/AnsiballZ_systemd.py && sleep 0' 27885 1726882543.94744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882543.94748: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27885 1726882543.94829: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882543.94865: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882543.94880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882543.94914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882543.95023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882544.23707: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl cal<<< 27885 1726882544.23727: stdout chunk (state=3): >>>l org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4694016", "MemoryPeak": "7507968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3311513600", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1313393000", "TasksCurrent": "4", 
"EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "Priv<<< 27885 1726882544.23748: stdout chunk 
(state=3): >>>ateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target multi-user.target", "After": "basic.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", 
"ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 27885 1726882544.25467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882544.25505: stderr chunk (state=3): >>><<< 27885 1726882544.25507: stdout chunk (state=3): >>><<< 27885 1726882544.25524: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4694016", "MemoryPeak": "7507968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", 
"MemoryAvailable": "3311513600", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1313393000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", 
"ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target multi-user.target", "After": "basic.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", 
"NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
27885 1726882544.25639: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882543.4027245-28637-246277185026214/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882544.25647: _low_level_execute_command(): starting 27885 1726882544.25650: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882543.4027245-28637-246277185026214/ > /dev/null 2>&1 && sleep 0' 27885 1726882544.26079: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882544.26118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882544.26121: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882544.26124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882544.26126: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882544.26128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882544.26129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882544.26176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882544.26180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882544.26184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882544.26244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882544.28027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882544.28053: stderr chunk (state=3): >>><<< 27885 1726882544.28056: stdout chunk (state=3): >>><<< 27885 1726882544.28066: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882544.28072: handler run complete 27885 1726882544.28115: attempt loop complete, returning result 27885 1726882544.28118: _execute() done 27885 1726882544.28120: dumping result to json 27885 1726882544.28132: done dumping result, returning 27885 1726882544.28141: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-3fa5-01be-000000000027] 27885 1726882544.28148: sending task result for task 12673a56-9f93-3fa5-01be-000000000027 27885 1726882544.28364: done sending task result for task 12673a56-9f93-3fa5-01be-000000000027 27885 1726882544.28367: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27885 1726882544.28418: no more pending results, returning what we have 27885 1726882544.28422: results queue empty 27885 1726882544.28423: checking for any_errors_fatal 27885 1726882544.28431: done checking for any_errors_fatal 27885 1726882544.28432: checking for max_fail_percentage 27885 1726882544.28433: done checking for max_fail_percentage 27885 1726882544.28434: checking to see if all hosts have failed and the running result is not ok 27885 1726882544.28434: done checking to see if all hosts have failed 27885 1726882544.28435: getting the remaining hosts for this loop 27885 1726882544.28437: done getting the remaining hosts for this loop 27885 1726882544.28440: getting the next task for host managed_node2 27885 1726882544.28445: done getting next task for host managed_node2 27885 1726882544.28448: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 27885 1726882544.28452: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882544.28461: getting variables 27885 1726882544.28463: in VariableManager get_vars() 27885 1726882544.28500: Calling all_inventory to load vars for managed_node2 27885 1726882544.28506: Calling groups_inventory to load vars for managed_node2 27885 1726882544.28509: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882544.28518: Calling all_plugins_play to load vars for managed_node2 27885 1726882544.28521: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882544.28523: Calling groups_plugins_play to load vars for managed_node2 27885 1726882544.29317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882544.30247: done with get_vars() 27885 1726882544.30262: done getting variables 27885 1726882544.30306: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:35:44 -0400 (0:00:01.042) 0:00:16.945 ****** 27885 1726882544.30329: entering _queue_task() for managed_node2/service 27885 1726882544.30545: worker is 1 (out of 1 available) 27885 1726882544.30561: exiting _queue_task() for managed_node2/service 27885 1726882544.30573: done queuing things up, now waiting for results queue to drain 27885 1726882544.30575: waiting for pending results... 27885 1726882544.30745: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 27885 1726882544.30834: in run() - task 12673a56-9f93-3fa5-01be-000000000028 27885 1726882544.30847: variable 'ansible_search_path' from source: unknown 27885 1726882544.30850: variable 'ansible_search_path' from source: unknown 27885 1726882544.30878: calling self._execute() 27885 1726882544.30948: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882544.30953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882544.30961: variable 'omit' from source: magic vars 27885 1726882544.31232: variable 'ansible_distribution_major_version' from source: facts 27885 1726882544.31240: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882544.31320: variable 'network_provider' from source: set_fact 27885 1726882544.31324: Evaluated conditional (network_provider == "nm"): True 27885 1726882544.31387: variable '__network_wpa_supplicant_required' from source: role '' defaults 27885 1726882544.31447: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27885 1726882544.31562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882544.32939: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882544.32985: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882544.33011: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882544.33037: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882544.33057: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882544.33117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882544.33147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882544.33165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882544.33190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882544.33208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882544.33241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882544.33258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882544.33275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882544.33305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882544.33317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882544.33344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882544.33361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882544.33377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882544.33406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882544.33418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 27885 1726882544.33510: variable 'network_connections' from source: task vars 27885 1726882544.33520: variable 'interface0' from source: play vars 27885 1726882544.33570: variable 'interface0' from source: play vars 27885 1726882544.33578: variable 'interface0' from source: play vars 27885 1726882544.33623: variable 'interface0' from source: play vars 27885 1726882544.33633: variable 'interface1' from source: play vars 27885 1726882544.33675: variable 'interface1' from source: play vars 27885 1726882544.33680: variable 'interface1' from source: play vars 27885 1726882544.33725: variable 'interface1' from source: play vars 27885 1726882544.33777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882544.33886: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882544.33916: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882544.33937: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882544.33958: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882544.33990: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882544.34009: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882544.34026: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882544.34043: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882544.34080: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882544.34237: variable 'network_connections' from source: task vars 27885 1726882544.34241: variable 'interface0' from source: play vars 27885 1726882544.34283: variable 'interface0' from source: play vars 27885 1726882544.34287: variable 'interface0' from source: play vars 27885 1726882544.34337: variable 'interface0' from source: play vars 27885 1726882544.34345: variable 'interface1' from source: play vars 27885 1726882544.34403: variable 'interface1' from source: play vars 27885 1726882544.34406: variable 'interface1' from source: play vars 27885 1726882544.34449: variable 'interface1' from source: play vars 27885 1726882544.34737: Evaluated conditional (__network_wpa_supplicant_required): False 27885 1726882544.34741: when evaluation is False, skipping this task 27885 1726882544.34743: _execute() done 27885 1726882544.34745: dumping result to json 27885 1726882544.34747: done dumping result, returning 27885 1726882544.34749: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-3fa5-01be-000000000028] 27885 1726882544.34750: sending task result for task 12673a56-9f93-3fa5-01be-000000000028 skipping: [managed_node2] => { "changed": false, "false_condition": 
"__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 27885 1726882544.34845: no more pending results, returning what we have 27885 1726882544.34848: results queue empty 27885 1726882544.34849: checking for any_errors_fatal 27885 1726882544.34865: done checking for any_errors_fatal 27885 1726882544.34865: checking for max_fail_percentage 27885 1726882544.34866: done checking for max_fail_percentage 27885 1726882544.34867: checking to see if all hosts have failed and the running result is not ok 27885 1726882544.34868: done checking to see if all hosts have failed 27885 1726882544.34869: getting the remaining hosts for this loop 27885 1726882544.34870: done getting the remaining hosts for this loop 27885 1726882544.34872: getting the next task for host managed_node2 27885 1726882544.34878: done getting next task for host managed_node2 27885 1726882544.34880: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 27885 1726882544.34883: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882544.34909: done sending task result for task 12673a56-9f93-3fa5-01be-000000000028 27885 1726882544.34912: WORKER PROCESS EXITING 27885 1726882544.34917: getting variables 27885 1726882544.34918: in VariableManager get_vars() 27885 1726882544.34955: Calling all_inventory to load vars for managed_node2 27885 1726882544.34958: Calling groups_inventory to load vars for managed_node2 27885 1726882544.34961: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882544.34969: Calling all_plugins_play to load vars for managed_node2 27885 1726882544.34972: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882544.34975: Calling groups_plugins_play to load vars for managed_node2 27885 1726882544.36259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882544.37850: done with get_vars() 27885 1726882544.37876: done getting variables 27885 1726882544.37943: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:35:44 -0400 (0:00:00.076) 0:00:17.022 ****** 27885 1726882544.37980: entering _queue_task() for managed_node2/service 27885 1726882544.38310: worker is 1 (out of 1 available) 27885 1726882544.38322: exiting _queue_task() for managed_node2/service 27885 1726882544.38335: done queuing things up, now waiting for results queue to drain 27885 1726882544.38337: waiting for pending results... 
27885 1726882544.38722: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 27885 1726882544.38771: in run() - task 12673a56-9f93-3fa5-01be-000000000029 27885 1726882544.38800: variable 'ansible_search_path' from source: unknown 27885 1726882544.38810: variable 'ansible_search_path' from source: unknown 27885 1726882544.38854: calling self._execute() 27885 1726882544.38960: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882544.38973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882544.38992: variable 'omit' from source: magic vars 27885 1726882544.39471: variable 'ansible_distribution_major_version' from source: facts 27885 1726882544.39474: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882544.39595: variable 'network_provider' from source: set_fact 27885 1726882544.39608: Evaluated conditional (network_provider == "initscripts"): False 27885 1726882544.39617: when evaluation is False, skipping this task 27885 1726882544.39692: _execute() done 27885 1726882544.39698: dumping result to json 27885 1726882544.39700: done dumping result, returning 27885 1726882544.39703: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-3fa5-01be-000000000029] 27885 1726882544.39705: sending task result for task 12673a56-9f93-3fa5-01be-000000000029 27885 1726882544.39774: done sending task result for task 12673a56-9f93-3fa5-01be-000000000029 27885 1726882544.39778: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27885 1726882544.39838: no more pending results, returning what we have 27885 1726882544.39842: results queue empty 27885 1726882544.39844: checking for any_errors_fatal 27885 1726882544.39853: done checking for any_errors_fatal 27885 1726882544.39854: checking for max_fail_percentage 27885 1726882544.39856: done checking for max_fail_percentage 27885 1726882544.39857: checking to see if all hosts have failed and the running result is not ok 27885 1726882544.39858: done checking to see if all hosts have failed 27885 1726882544.39859: getting the remaining hosts for this loop 27885 1726882544.39862: done getting the remaining hosts for this loop 27885 1726882544.39865: getting the next task for host managed_node2 27885 1726882544.39873: done getting next task for host managed_node2 27885 1726882544.39877: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 27885 1726882544.39880: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882544.39902: getting variables 27885 1726882544.39905: in VariableManager get_vars() 27885 1726882544.39948: Calling all_inventory to load vars for managed_node2 27885 1726882544.39951: Calling groups_inventory to load vars for managed_node2 27885 1726882544.39954: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882544.39965: Calling all_plugins_play to load vars for managed_node2 27885 1726882544.39969: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882544.39972: Calling groups_plugins_play to load vars for managed_node2 27885 1726882544.41697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882544.43264: done with get_vars() 27885 1726882544.43286: done getting variables 27885 1726882544.43349: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:35:44 -0400 (0:00:00.054) 0:00:17.076 ****** 27885 1726882544.43383: entering _queue_task() for managed_node2/copy 27885 1726882544.43682: worker is 1 (out of 1 available) 27885 1726882544.43898: exiting _queue_task() for managed_node2/copy 27885 1726882544.43910: done queuing things up, now waiting for results queue to drain 27885 1726882544.43911: waiting for pending results... 27885 1726882544.44041: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 27885 1726882544.44246: in run() - task 12673a56-9f93-3fa5-01be-00000000002a 27885 1726882544.44250: variable 'ansible_search_path' from source: unknown 27885 1726882544.44252: variable 'ansible_search_path' from source: unknown 27885 1726882544.44254: calling self._execute() 27885 1726882544.44307: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882544.44319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882544.44334: variable 'omit' from source: magic vars 27885 1726882544.44718: variable 'ansible_distribution_major_version' from source: facts 27885 1726882544.44735: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882544.44862: variable 'network_provider' from source: set_fact 27885 1726882544.44873: Evaluated conditional (network_provider == "initscripts"): False 27885 1726882544.44881: when evaluation is False, skipping this task 27885 1726882544.44895: _execute() done 27885 1726882544.44908: dumping result to json 27885 1726882544.44916: done dumping result, returning 27885 1726882544.44928: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-3fa5-01be-00000000002a] 27885 1726882544.44938: sending task result for task 12673a56-9f93-3fa5-01be-00000000002a skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 27885 1726882544.45183: no more pending results, returning what we have 27885 
1726882544.45187: results queue empty 27885 1726882544.45188: checking for any_errors_fatal 27885 1726882544.45200: done checking for any_errors_fatal 27885 1726882544.45201: checking for max_fail_percentage 27885 1726882544.45203: done checking for max_fail_percentage 27885 1726882544.45204: checking to see if all hosts have failed and the running result is not ok 27885 1726882544.45205: done checking to see if all hosts have failed 27885 1726882544.45206: getting the remaining hosts for this loop 27885 1726882544.45207: done getting the remaining hosts for this loop 27885 1726882544.45211: getting the next task for host managed_node2 27885 1726882544.45219: done getting next task for host managed_node2 27885 1726882544.45224: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 27885 1726882544.45228: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882544.45246: getting variables 27885 1726882544.45248: in VariableManager get_vars() 27885 1726882544.45289: Calling all_inventory to load vars for managed_node2 27885 1726882544.45463: Calling groups_inventory to load vars for managed_node2 27885 1726882544.45467: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882544.45473: done sending task result for task 12673a56-9f93-3fa5-01be-00000000002a 27885 1726882544.45476: WORKER PROCESS EXITING 27885 1726882544.45483: Calling all_plugins_play to load vars for managed_node2 27885 1726882544.45486: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882544.45495: Calling groups_plugins_play to load vars for managed_node2 27885 1726882544.46761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882544.48339: done with get_vars() 27885 1726882544.48362: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:35:44 -0400 (0:00:00.050) 0:00:17.126 ****** 27885 1726882544.48450: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 27885 1726882544.48452: Creating lock for fedora.linux_system_roles.network_connections 27885 1726882544.48781: worker is 1 (out of 1 available) 27885 1726882544.48797: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 27885 1726882544.48810: done queuing things up, now waiting for results queue to drain 27885 1726882544.48811: waiting for pending results... 
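The two skips in this stretch ("Enable network service" and "Ensure initscripts network file dependency is present") hang off the same gate: the play resolved network_provider to "nm" via set_fact, so the initscripts-only branch is never taken. A sketch of the pattern with an illustrative task body (the dest and content shown are assumptions, not copied from the role):

    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:
        dest: /etc/sysconfig/network        # assumed target, for illustration only
        content: "# Created by the network system role\n"
        force: false
      when: network_provider == "initscripts"

With network_provider set to "nm", NetworkManager owns the connection profiles and no initscripts dependency file is needed, hence the clean skips.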
27885 1726882544.49096: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 27885 1726882544.49250: in run() - task 12673a56-9f93-3fa5-01be-00000000002b 27885 1726882544.49272: variable 'ansible_search_path' from source: unknown 27885 1726882544.49281: variable 'ansible_search_path' from source: unknown 27885 1726882544.49329: calling self._execute() 27885 1726882544.49435: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882544.49448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882544.49463: variable 'omit' from source: magic vars 27885 1726882544.49853: variable 'ansible_distribution_major_version' from source: facts 27885 1726882544.49875: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882544.49974: variable 'omit' from source: magic vars 27885 1726882544.49978: variable 'omit' from source: magic vars 27885 1726882544.50126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882544.53004: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882544.53137: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882544.53188: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882544.53233: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882544.53286: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882544.53434: variable 'network_provider' from source: set_fact 27885 1726882544.53605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882544.53671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882544.53799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882544.53807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882544.53810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882544.53880: variable 'omit' from source: magic vars 27885 1726882544.54060: variable 'omit' from source: magic vars 27885 1726882544.54180: variable 'network_connections' from source: task vars 27885 1726882544.54203: variable 'interface0' from source: play vars 27885 1726882544.54297: variable 'interface0' from source: play vars 27885 1726882544.54313: variable 'interface0' from source: play vars 27885 1726882544.54398: variable 'interface0' from source: play vars 27885 1726882544.54418: variable 'interface1' from source: play vars 27885 
1726882544.54484: variable 'interface1' from source: play vars 27885 1726882544.54503: variable 'interface1' from source: play vars 27885 1726882544.54573: variable 'interface1' from source: play vars 27885 1726882544.54900: variable 'omit' from source: magic vars 27885 1726882544.54903: variable '__lsr_ansible_managed' from source: task vars 27885 1726882544.54922: variable '__lsr_ansible_managed' from source: task vars 27885 1726882544.55226: Loaded config def from plugin (lookup/template) 27885 1726882544.55237: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 27885 1726882544.55269: File lookup term: get_ansible_managed.j2 27885 1726882544.55277: variable 'ansible_search_path' from source: unknown 27885 1726882544.55291: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 27885 1726882544.55313: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 27885 1726882544.55342: variable 'ansible_search_path' from source: unknown 27885 1726882544.64785: variable 'ansible_managed' from source: unknown 27885 1726882544.64957: variable 'omit' from source: magic vars 27885 1726882544.65039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882544.65043: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882544.65049: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882544.65067: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882544.65079: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882544.65115: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882544.65124: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882544.65131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882544.65247: Set connection var ansible_pipelining to False 27885 1726882544.65271: Set connection var ansible_connection to ssh 27885 1726882544.65369: Set connection var ansible_timeout to 10 27885 1726882544.65372: Set connection var ansible_shell_type to sh 27885 1726882544.65374: Set connection var 
ansible_shell_executable to /bin/sh 27885 1726882544.65376: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882544.65379: variable 'ansible_shell_executable' from source: unknown 27885 1726882544.65381: variable 'ansible_connection' from source: unknown 27885 1726882544.65383: variable 'ansible_module_compression' from source: unknown 27885 1726882544.65385: variable 'ansible_shell_type' from source: unknown 27885 1726882544.65386: variable 'ansible_shell_executable' from source: unknown 27885 1726882544.65388: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882544.65395: variable 'ansible_pipelining' from source: unknown 27885 1726882544.65397: variable 'ansible_timeout' from source: unknown 27885 1726882544.65407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882544.65568: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882544.65587: variable 'omit' from source: magic vars 27885 1726882544.65607: starting attempt loop 27885 1726882544.65614: running the handler 27885 1726882544.65632: _low_level_execute_command(): starting 27885 1726882544.65642: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882544.66506: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882544.66534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882544.66601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882544.66619: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882544.66641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882544.66830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882544.68475: stdout chunk (state=3): >>>/root <<< 27885 1726882544.68607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882544.68899: stderr chunk (state=3): >>><<< 27885 1726882544.68902: stdout chunk (state=3): >>><<< 27885 1726882544.68905: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 
10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882544.68907: _low_level_execute_command(): starting 27885 1726882544.68910: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882544.6871595-28707-151603035106969 `" && echo ansible-tmp-1726882544.6871595-28707-151603035106969="` echo /root/.ansible/tmp/ansible-tmp-1726882544.6871595-28707-151603035106969 `" ) && sleep 0' 27885 1726882544.69916: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882544.70009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882544.70021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882544.70036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882544.70055: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882544.70058: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882544.70065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882544.70089: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27885 1726882544.70170: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882544.70313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882544.70627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882544.72463: stdout chunk (state=3): >>>ansible-tmp-1726882544.6871595-28707-151603035106969=/root/.ansible/tmp/ansible-tmp-1726882544.6871595-28707-151603035106969 <<< 27885 1726882544.72515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882544.72519: stdout chunk (state=3): >>><<< 27885 1726882544.72521: stderr chunk (state=3): >>><<< 27885 1726882544.72701: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882544.6871595-28707-151603035106969=/root/.ansible/tmp/ansible-tmp-1726882544.6871595-28707-151603035106969 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882544.72705: variable 'ansible_module_compression' from source: unknown 27885 1726882544.72786: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 27885 1726882544.72789: ANSIBALLZ: Acquiring lock 27885 1726882544.72792: ANSIBALLZ: Lock acquired: 140560088147648 27885 1726882544.72805: ANSIBALLZ: Creating module 27885 1726882545.15696: ANSIBALLZ: Writing module into payload 27885 1726882545.16424: ANSIBALLZ: Writing module 27885 1726882545.16500: ANSIBALLZ: Renaming module 27885 1726882545.16503: ANSIBALLZ: Done creating module 27885 1726882545.16505: variable 'ansible_facts' from source: unknown 27885 1726882545.16579: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882544.6871595-28707-151603035106969/AnsiballZ_network_connections.py 27885 1726882545.16910: Sending initial data 27885 1726882545.16913: Sent initial data (168 bytes) 27885 1726882545.18208: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882545.18502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882545.18601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882545.20176: stderr chunk (state=3): >>>debug2: Remote 
version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882545.20370: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882545.20389: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpxn75tayl /root/.ansible/tmp/ansible-tmp-1726882544.6871595-28707-151603035106969/AnsiballZ_network_connections.py <<< 27885 1726882545.20398: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882544.6871595-28707-151603035106969/AnsiballZ_network_connections.py" <<< 27885 1726882545.20634: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpxn75tayl" to remote "/root/.ansible/tmp/ansible-tmp-1726882544.6871595-28707-151603035106969/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882544.6871595-28707-151603035106969/AnsiballZ_network_connections.py" <<< 27885 1726882545.23367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882545.23376: stdout chunk (state=3): >>><<< 27885 1726882545.23379: stderr chunk (state=3): >>><<< 27885 1726882545.23485: done transferring module to remote 27885 1726882545.23492: _low_level_execute_command(): starting 27885 1726882545.23496: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882544.6871595-28707-151603035106969/ /root/.ansible/tmp/ansible-tmp-1726882544.6871595-28707-151603035106969/AnsiballZ_network_connections.py && sleep 0' 27885 1726882545.24651: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882545.24660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882545.24671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882545.24686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882545.24703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882545.24710: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882545.24720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882545.24734: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27885 1726882545.24742: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882545.24749: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27885 1726882545.25037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882545.25042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882545.25119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882545.26999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882545.27003: stdout chunk (state=3): >>><<< 27885 1726882545.27005: stderr chunk (state=3): >>><<< 27885 1726882545.27008: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882545.27010: _low_level_execute_command(): starting 27885 1726882545.27018: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882544.6871595-28707-151603035106969/AnsiballZ_network_connections.py && sleep 0' 27885 1726882545.28209: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882545.28344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882545.28377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 27885 1726882545.28447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882545.95175: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 4e1ae63c-1d45-4cf6-8c23-7abd72c157ff\n[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 9dc4eccb-4733-427c-9f5b-05ec76f599ff\n[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 4e1ae63c-1d45-4cf6-8c23-7abd72c157ff (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 9dc4eccb-4733-427c-9f5b-05ec76f599ff (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 27885 1726882545.97149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882545.97300: stderr chunk (state=3): >>><<< 27885 1726882545.97304: stdout chunk (state=3): >>><<< 27885 1726882545.97307: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 4e1ae63c-1d45-4cf6-8c23-7abd72c157ff\n[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 9dc4eccb-4733-427c-9f5b-05ec76f599ff\n[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 4e1ae63c-1d45-4cf6-8c23-7abd72c157ff (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 9dc4eccb-4733-427c-9f5b-05ec76f599ff (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection 
to 10.31.14.69 closed. 27885 1726882545.97492: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.3/24', '2001:db8::2/32'], 'route': [{'network': '198.51.10.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4}, {'network': '2001:db6::4', 'prefix': 128, 'gateway': '2001:db8::1', 'metric': 2}]}}, {'name': 'ethtest1', 'interface_name': 'ethtest1', 'state': 'up', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.6/24', '2001:db8::4/32'], 'route': [{'network': '198.51.12.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882544.6871595-28707-151603035106969/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882545.97498: _low_level_execute_command(): starting 27885 1726882545.97501: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882544.6871595-28707-151603035106969/ > /dev/null 2>&1 && sleep 0' 27885 1726882545.99325: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882545.99576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882545.99804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882545.99816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882546.01764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882546.01815: stderr chunk (state=3): >>><<< 27885 1726882546.01933: stdout chunk (state=3): >>><<< 27885 1726882546.01937: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882546.01939: handler run complete 27885 1726882546.02070: attempt loop complete, returning result 27885 1726882546.02079: _execute() done 27885 1726882546.02087: dumping result to json 27885 1726882546.02106: done dumping result, returning 27885 1726882546.02165: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-3fa5-01be-00000000002b] 27885 1726882546.02300: sending task result for task 12673a56-9f93-3fa5-01be-00000000002b 27885 1726882546.02719: done sending task result for task 12673a56-9f93-3fa5-01be-00000000002b 27885 1726882546.02723: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/24", "2001:db8::2/32" ], "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.10.64", "prefix": 26 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db6::4", "prefix": 128 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" }, { "autoconnect": false, "interface_name": "ethtest1", "ip": { "address": [ "198.51.100.6/24", "2001:db8::4/32" ], "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.12.128", "prefix": 26 } ] }, "name": "ethtest1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 4e1ae63c-1d45-4cf6-8c23-7abd72c157ff [006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 9dc4eccb-4733-427c-9f5b-05ec76f599ff [007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 4e1ae63c-1d45-4cf6-8c23-7abd72c157ff (not-active) [008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 9dc4eccb-4733-427c-9f5b-05ec76f599ff (not-active) 27885 1726882546.03035: no more pending results, returning what we have 27885 1726882546.03039: results queue empty 27885 1726882546.03040: checking for any_errors_fatal 27885 1726882546.03054: done checking for any_errors_fatal 27885 1726882546.03055: checking for max_fail_percentage 27885 1726882546.03057: done checking for max_fail_percentage 27885 1726882546.03058: checking to see if all hosts have failed and the running result is not ok 
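The changed result above is driven entirely by the play's network_connections input; reconstructed from the module_args echoed back in the result, it corresponds to roughly the following role variable (same data, YAML form; the __header argument is generated internally by the role from the get_ansible_managed.j2 template lookup seen earlier and is not part of this variable):

    network_connections:
      - name: ethtest0
        interface_name: ethtest0
        type: ethernet
        state: up
        autoconnect: false
        ip:
          address:
            - 198.51.100.3/24
            - "2001:db8::2/32"
          route:
            - network: 198.51.10.64
              prefix: 26
              gateway: 198.51.100.6
              metric: 4
            - network: "2001:db6::4"
              prefix: 128
              gateway: "2001:db8::1"
              metric: 2
      - name: ethtest1
        interface_name: ethtest1
        type: ethernet
        state: up
        autoconnect: false
        ip:
          address:
            - 198.51.100.6/24
            - "2001:db8::4/32"
          route:
            - network: 198.51.12.128
              prefix: 26
              gateway: 198.51.100.1
              metric: 2

The UUIDs in the STDERR lines are the NetworkManager connection UUIDs assigned when the two profiles were created; "(not-active)" records that a profile was not yet active at the moment its activation was requested.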
27885 1726882546.03059: done checking to see if all hosts have failed 27885 1726882546.03059: getting the remaining hosts for this loop 27885 1726882546.03061: done getting the remaining hosts for this loop 27885 1726882546.03065: getting the next task for host managed_node2 27885 1726882546.03070: done getting next task for host managed_node2 27885 1726882546.03074: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 27885 1726882546.03077: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882546.03088: getting variables 27885 1726882546.03094: in VariableManager get_vars() 27885 1726882546.03289: Calling all_inventory to load vars for managed_node2 27885 1726882546.03296: Calling groups_inventory to load vars for managed_node2 27885 1726882546.03302: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882546.03315: Calling all_plugins_play to load vars for managed_node2 27885 1726882546.03323: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882546.03327: Calling groups_plugins_play to load vars for managed_node2 27885 1726882546.06942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882546.10368: done with get_vars() 27885 1726882546.10508: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:35:46 -0400 (0:00:01.621) 0:00:18.748 ****** 27885 1726882546.10599: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 27885 1726882546.10601: Creating lock for fedora.linux_system_roles.network_state 27885 1726882546.11350: worker is 1 (out of 1 available) 27885 1726882546.11361: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 27885 1726882546.11494: done queuing things up, now waiting for results queue to drain 27885 1726882546.11496: waiting for pending results... 
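The task being queued here, "Configure networking state", is the role's declarative network_state path; in the next chunk it is skipped because network_state is left at its role default of {}. A hypothetical play-level invocation (the test_profiles variable name is illustrative) showing how the two inputs relate:

    - hosts: managed_node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            # declarative connection profiles, as reconstructed above
            network_connections: "{{ test_profiles }}"
            # a full desired-state description could be supplied instead via
            # network_state; leaving it at the default {} skips that task
            # network_state: {}

This test play drives everything through network_connections, so the network_state branch is a no-op.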
27885 1726882546.11849: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 27885 1726882546.12355: in run() - task 12673a56-9f93-3fa5-01be-00000000002c 27885 1726882546.12359: variable 'ansible_search_path' from source: unknown 27885 1726882546.12361: variable 'ansible_search_path' from source: unknown 27885 1726882546.12364: calling self._execute() 27885 1726882546.12384: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882546.12394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882546.12402: variable 'omit' from source: magic vars 27885 1726882546.13196: variable 'ansible_distribution_major_version' from source: facts 27885 1726882546.13204: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882546.13443: variable 'network_state' from source: role '' defaults 27885 1726882546.13447: Evaluated conditional (network_state != {}): False 27885 1726882546.13452: when evaluation is False, skipping this task 27885 1726882546.13552: _execute() done 27885 1726882546.13556: dumping result to json 27885 1726882546.13559: done dumping result, returning 27885 1726882546.13562: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-3fa5-01be-00000000002c] 27885 1726882546.13565: sending task result for task 12673a56-9f93-3fa5-01be-00000000002c skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27885 1726882546.13943: no more pending results, returning what we have 27885 1726882546.13947: results queue empty 27885 1726882546.13948: checking for any_errors_fatal 27885 1726882546.13966: done checking for any_errors_fatal 27885 1726882546.13967: checking for max_fail_percentage 27885 1726882546.13969: done checking for max_fail_percentage 27885 1726882546.13970: checking to see if all hosts have failed and the running result is not ok 27885 1726882546.13970: done checking to see if all hosts have failed 27885 1726882546.13971: getting the remaining hosts for this loop 27885 1726882546.13973: done getting the remaining hosts for this loop 27885 1726882546.13976: getting the next task for host managed_node2 27885 1726882546.13991: done getting next task for host managed_node2 27885 1726882546.13997: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 27885 1726882546.14000: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882546.14014: getting variables 27885 1726882546.14016: in VariableManager get_vars() 27885 1726882546.14049: Calling all_inventory to load vars for managed_node2 27885 1726882546.14052: Calling groups_inventory to load vars for managed_node2 27885 1726882546.14054: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882546.14063: Calling all_plugins_play to load vars for managed_node2 27885 1726882546.14065: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882546.14068: Calling groups_plugins_play to load vars for managed_node2 27885 1726882546.14699: done sending task result for task 12673a56-9f93-3fa5-01be-00000000002c 27885 1726882546.14702: WORKER PROCESS EXITING 27885 1726882546.16917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882546.19747: done with get_vars() 27885 1726882546.19770: done getting variables 27885 1726882546.19841: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:35:46 -0400 (0:00:00.092) 0:00:18.841 ****** 27885 1726882546.19874: entering _queue_task() for managed_node2/debug 27885 1726882546.20335: worker is 1 (out of 1 available) 27885 1726882546.20346: exiting _queue_task() for managed_node2/debug 27885 1726882546.20357: done queuing things up, now waiting for results queue to drain 27885 1726882546.20358: waiting for pending results... 
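The debug tasks that close out this run only echo the registered result back into the play output. Judging from the value printed below, the task queued here is essentially (sketch, not the literal task file):

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines

The companion "Show debug messages for the network_connections" task queued right after it reports the module's debug output from the same registered __network_connections_result in a similar way.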
27885 1726882546.20553: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 27885 1726882546.20761: in run() - task 12673a56-9f93-3fa5-01be-00000000002d 27885 1726882546.20815: variable 'ansible_search_path' from source: unknown 27885 1726882546.20824: variable 'ansible_search_path' from source: unknown 27885 1726882546.20867: calling self._execute() 27885 1726882546.21004: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882546.21017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882546.21037: variable 'omit' from source: magic vars 27885 1726882546.21446: variable 'ansible_distribution_major_version' from source: facts 27885 1726882546.21804: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882546.21807: variable 'omit' from source: magic vars 27885 1726882546.21810: variable 'omit' from source: magic vars 27885 1726882546.21910: variable 'omit' from source: magic vars 27885 1726882546.21913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882546.21916: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882546.21931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882546.21983: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882546.22008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882546.22103: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882546.22114: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882546.22128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882546.22355: Set connection var ansible_pipelining to False 27885 1726882546.22675: Set connection var ansible_connection to ssh 27885 1726882546.22678: Set connection var ansible_timeout to 10 27885 1726882546.22680: Set connection var ansible_shell_type to sh 27885 1726882546.22683: Set connection var ansible_shell_executable to /bin/sh 27885 1726882546.22685: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882546.22686: variable 'ansible_shell_executable' from source: unknown 27885 1726882546.22688: variable 'ansible_connection' from source: unknown 27885 1726882546.22690: variable 'ansible_module_compression' from source: unknown 27885 1726882546.22692: variable 'ansible_shell_type' from source: unknown 27885 1726882546.22697: variable 'ansible_shell_executable' from source: unknown 27885 1726882546.22699: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882546.22701: variable 'ansible_pipelining' from source: unknown 27885 1726882546.22703: variable 'ansible_timeout' from source: unknown 27885 1726882546.22705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882546.23200: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 
1726882546.23204: variable 'omit' from source: magic vars 27885 1726882546.23206: starting attempt loop 27885 1726882546.23210: running the handler 27885 1726882546.23524: variable '__network_connections_result' from source: set_fact 27885 1726882546.23740: handler run complete 27885 1726882546.23744: attempt loop complete, returning result 27885 1726882546.23746: _execute() done 27885 1726882546.23748: dumping result to json 27885 1726882546.23750: done dumping result, returning 27885 1726882546.23762: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-3fa5-01be-00000000002d] 27885 1726882546.23771: sending task result for task 12673a56-9f93-3fa5-01be-00000000002d 27885 1726882546.24032: done sending task result for task 12673a56-9f93-3fa5-01be-00000000002d 27885 1726882546.24035: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 4e1ae63c-1d45-4cf6-8c23-7abd72c157ff", "[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 9dc4eccb-4733-427c-9f5b-05ec76f599ff", "[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 4e1ae63c-1d45-4cf6-8c23-7abd72c157ff (not-active)", "[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 9dc4eccb-4733-427c-9f5b-05ec76f599ff (not-active)" ] } 27885 1726882546.24131: no more pending results, returning what we have 27885 1726882546.24135: results queue empty 27885 1726882546.24137: checking for any_errors_fatal 27885 1726882546.24146: done checking for any_errors_fatal 27885 1726882546.24146: checking for max_fail_percentage 27885 1726882546.24148: done checking for max_fail_percentage 27885 1726882546.24149: checking to see if all hosts have failed and the running result is not ok 27885 1726882546.24150: done checking to see if all hosts have failed 27885 1726882546.24151: getting the remaining hosts for this loop 27885 1726882546.24152: done getting the remaining hosts for this loop 27885 1726882546.24156: getting the next task for host managed_node2 27885 1726882546.24163: done getting next task for host managed_node2 27885 1726882546.24167: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 27885 1726882546.24171: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882546.24181: getting variables 27885 1726882546.24183: in VariableManager get_vars() 27885 1726882546.24227: Calling all_inventory to load vars for managed_node2 27885 1726882546.24230: Calling groups_inventory to load vars for managed_node2 27885 1726882546.24233: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882546.24243: Calling all_plugins_play to load vars for managed_node2 27885 1726882546.24247: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882546.24250: Calling groups_plugins_play to load vars for managed_node2 27885 1726882546.25980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882546.28500: done with get_vars() 27885 1726882546.28538: done getting variables 27885 1726882546.28602: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:35:46 -0400 (0:00:00.087) 0:00:18.928 ****** 27885 1726882546.28650: entering _queue_task() for managed_node2/debug 27885 1726882546.29610: worker is 1 (out of 1 available) 27885 1726882546.29623: exiting _queue_task() for managed_node2/debug 27885 1726882546.29635: done queuing things up, now waiting for results queue to drain 27885 1726882546.29637: waiting for pending results... 27885 1726882546.30318: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 27885 1726882546.30420: in run() - task 12673a56-9f93-3fa5-01be-00000000002e 27885 1726882546.30604: variable 'ansible_search_path' from source: unknown 27885 1726882546.30608: variable 'ansible_search_path' from source: unknown 27885 1726882546.30701: calling self._execute() 27885 1726882546.30938: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882546.30941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882546.31069: variable 'omit' from source: magic vars 27885 1726882546.31528: variable 'ansible_distribution_major_version' from source: facts 27885 1726882546.31542: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882546.31556: variable 'omit' from source: magic vars 27885 1726882546.31832: variable 'omit' from source: magic vars 27885 1726882546.31843: variable 'omit' from source: magic vars 27885 1726882546.32044: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882546.32123: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882546.32143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882546.32198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882546.32201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882546.32204: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882546.32206: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882546.32211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882546.32399: Set connection var ansible_pipelining to False 27885 1726882546.32402: Set connection var ansible_connection to ssh 27885 1726882546.32405: Set connection var ansible_timeout to 10 27885 1726882546.32407: Set connection var ansible_shell_type to sh 27885 1726882546.32408: Set connection var ansible_shell_executable to /bin/sh 27885 1726882546.32410: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882546.32412: variable 'ansible_shell_executable' from source: unknown 27885 1726882546.32414: variable 'ansible_connection' from source: unknown 27885 1726882546.32417: variable 'ansible_module_compression' from source: unknown 27885 1726882546.32419: variable 'ansible_shell_type' from source: unknown 27885 1726882546.32421: variable 'ansible_shell_executable' from source: unknown 27885 1726882546.32423: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882546.32425: variable 'ansible_pipelining' from source: unknown 27885 1726882546.32427: variable 'ansible_timeout' from source: unknown 27885 1726882546.32429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882546.32716: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882546.32726: variable 'omit' from source: magic vars 27885 1726882546.32731: starting attempt loop 27885 1726882546.32735: running the handler 27885 1726882546.32907: variable '__network_connections_result' from source: set_fact 27885 1726882546.33061: variable '__network_connections_result' from source: set_fact 27885 1726882546.33464: handler run complete 27885 1726882546.33502: attempt loop complete, returning result 27885 1726882546.33505: _execute() done 27885 1726882546.33508: dumping result to json 27885 1726882546.33514: done dumping result, returning 27885 1726882546.33523: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-3fa5-01be-00000000002e] 27885 1726882546.33527: sending task result for task 12673a56-9f93-3fa5-01be-00000000002e 27885 1726882546.33648: done sending task result for task 12673a56-9f93-3fa5-01be-00000000002e 27885 1726882546.33652: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/24", "2001:db8::2/32" ], "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.10.64", "prefix": 26 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db6::4", "prefix": 128 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" }, { "autoconnect": false, "interface_name": "ethtest1", "ip": { "address": [ "198.51.100.6/24", "2001:db8::4/32" ], "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.12.128", "prefix": 26 } ] }, "name": "ethtest1", 
"state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 4e1ae63c-1d45-4cf6-8c23-7abd72c157ff\n[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 9dc4eccb-4733-427c-9f5b-05ec76f599ff\n[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 4e1ae63c-1d45-4cf6-8c23-7abd72c157ff (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 9dc4eccb-4733-427c-9f5b-05ec76f599ff (not-active)\n", "stderr_lines": [ "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 4e1ae63c-1d45-4cf6-8c23-7abd72c157ff", "[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 9dc4eccb-4733-427c-9f5b-05ec76f599ff", "[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 4e1ae63c-1d45-4cf6-8c23-7abd72c157ff (not-active)", "[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 9dc4eccb-4733-427c-9f5b-05ec76f599ff (not-active)" ] } } 27885 1726882546.33790: no more pending results, returning what we have 27885 1726882546.33795: results queue empty 27885 1726882546.33797: checking for any_errors_fatal 27885 1726882546.33802: done checking for any_errors_fatal 27885 1726882546.33802: checking for max_fail_percentage 27885 1726882546.33804: done checking for max_fail_percentage 27885 1726882546.33805: checking to see if all hosts have failed and the running result is not ok 27885 1726882546.33805: done checking to see if all hosts have failed 27885 1726882546.33806: getting the remaining hosts for this loop 27885 1726882546.33808: done getting the remaining hosts for this loop 27885 1726882546.33811: getting the next task for host managed_node2 27885 1726882546.33817: done getting next task for host managed_node2 27885 1726882546.33820: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 27885 1726882546.33824: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882546.33833: getting variables 27885 1726882546.33835: in VariableManager get_vars() 27885 1726882546.33872: Calling all_inventory to load vars for managed_node2 27885 1726882546.33875: Calling groups_inventory to load vars for managed_node2 27885 1726882546.33877: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882546.33886: Calling all_plugins_play to load vars for managed_node2 27885 1726882546.33888: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882546.33891: Calling groups_plugins_play to load vars for managed_node2 27885 1726882546.36595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882546.38404: done with get_vars() 27885 1726882546.38431: done getting variables 27885 1726882546.38509: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:35:46 -0400 (0:00:00.098) 0:00:19.027 ****** 27885 1726882546.38543: entering _queue_task() for managed_node2/debug 27885 1726882546.39110: worker is 1 (out of 1 available) 27885 1726882546.39122: exiting _queue_task() for managed_node2/debug 27885 1726882546.39133: done queuing things up, now waiting for results queue to drain 27885 1726882546.39134: waiting for pending results... 27885 1726882546.39332: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 27885 1726882546.39690: in run() - task 12673a56-9f93-3fa5-01be-00000000002f 27885 1726882546.39696: variable 'ansible_search_path' from source: unknown 27885 1726882546.39699: variable 'ansible_search_path' from source: unknown 27885 1726882546.39713: calling self._execute() 27885 1726882546.39863: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882546.40106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882546.40109: variable 'omit' from source: magic vars 27885 1726882546.41306: variable 'ansible_distribution_major_version' from source: facts 27885 1726882546.41466: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882546.41817: variable 'network_state' from source: role '' defaults 27885 1726882546.41833: Evaluated conditional (network_state != {}): False 27885 1726882546.41841: when evaluation is False, skipping this task 27885 1726882546.41850: _execute() done 27885 1726882546.41895: dumping result to json 27885 1726882546.41906: done dumping result, returning 27885 1726882546.42091: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-3fa5-01be-00000000002f] 27885 1726882546.42096: sending task result for task 12673a56-9f93-3fa5-01be-00000000002f 27885 1726882546.42168: done sending task result for task 12673a56-9f93-3fa5-01be-00000000002f 27885 1726882546.42171: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 27885 1726882546.42225: no more pending results, returning what we 
have 27885 1726882546.42229: results queue empty 27885 1726882546.42230: checking for any_errors_fatal 27885 1726882546.42242: done checking for any_errors_fatal 27885 1726882546.42243: checking for max_fail_percentage 27885 1726882546.42245: done checking for max_fail_percentage 27885 1726882546.42246: checking to see if all hosts have failed and the running result is not ok 27885 1726882546.42247: done checking to see if all hosts have failed 27885 1726882546.42248: getting the remaining hosts for this loop 27885 1726882546.42249: done getting the remaining hosts for this loop 27885 1726882546.42253: getting the next task for host managed_node2 27885 1726882546.42261: done getting next task for host managed_node2 27885 1726882546.42265: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 27885 1726882546.42269: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882546.42288: getting variables 27885 1726882546.42290: in VariableManager get_vars() 27885 1726882546.42336: Calling all_inventory to load vars for managed_node2 27885 1726882546.42338: Calling groups_inventory to load vars for managed_node2 27885 1726882546.42341: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882546.42353: Calling all_plugins_play to load vars for managed_node2 27885 1726882546.42356: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882546.42359: Calling groups_plugins_play to load vars for managed_node2 27885 1726882546.45315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882546.49208: done with get_vars() 27885 1726882546.49247: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:35:46 -0400 (0:00:00.109) 0:00:19.137 ****** 27885 1726882546.49505: entering _queue_task() for managed_node2/ping 27885 1726882546.49519: Creating lock for ping 27885 1726882546.49936: worker is 1 (out of 1 available) 27885 1726882546.49949: exiting _queue_task() for managed_node2/ping 27885 1726882546.49983: done queuing things up, now waiting for results queue to drain 27885 1726882546.49985: waiting for pending results... 
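The "Show debug messages for the network_state" task above is skipped because network_state is still the role default ({}), so the conditional network_state != {} evaluates to False. The "Re-test connectivity" task (roles/network/tasks/main.yml:192) queued next uses the ping module, which runs a trivial module on the managed host and returns "pong". A minimal sketch, assuming the plain ping form (the role may use the fully qualified ansible.builtin.ping name):

    # Sketch only: confirms the controller can still reach and execute code on the host.
    - name: Re-test connectivity
      ping: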
27885 1726882546.50165: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 27885 1726882546.50302: in run() - task 12673a56-9f93-3fa5-01be-000000000030 27885 1726882546.50341: variable 'ansible_search_path' from source: unknown 27885 1726882546.50344: variable 'ansible_search_path' from source: unknown 27885 1726882546.50381: calling self._execute() 27885 1726882546.50439: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882546.50442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882546.50475: variable 'omit' from source: magic vars 27885 1726882546.50764: variable 'ansible_distribution_major_version' from source: facts 27885 1726882546.50774: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882546.50784: variable 'omit' from source: magic vars 27885 1726882546.50855: variable 'omit' from source: magic vars 27885 1726882546.50918: variable 'omit' from source: magic vars 27885 1726882546.50970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882546.51003: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882546.51033: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882546.51071: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882546.51074: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882546.51109: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882546.51112: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882546.51114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882546.51255: Set connection var ansible_pipelining to False 27885 1726882546.51260: Set connection var ansible_connection to ssh 27885 1726882546.51271: Set connection var ansible_timeout to 10 27885 1726882546.51273: Set connection var ansible_shell_type to sh 27885 1726882546.51276: Set connection var ansible_shell_executable to /bin/sh 27885 1726882546.51278: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882546.51304: variable 'ansible_shell_executable' from source: unknown 27885 1726882546.51307: variable 'ansible_connection' from source: unknown 27885 1726882546.51310: variable 'ansible_module_compression' from source: unknown 27885 1726882546.51312: variable 'ansible_shell_type' from source: unknown 27885 1726882546.51314: variable 'ansible_shell_executable' from source: unknown 27885 1726882546.51316: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882546.51318: variable 'ansible_pipelining' from source: unknown 27885 1726882546.51320: variable 'ansible_timeout' from source: unknown 27885 1726882546.51498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882546.51509: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882546.51516: variable 'omit' from source: magic vars 27885 
1726882546.51521: starting attempt loop 27885 1726882546.51524: running the handler 27885 1726882546.51539: _low_level_execute_command(): starting 27885 1726882546.51546: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882546.52456: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882546.52468: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882546.52482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882546.52499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882546.52515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882546.52518: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882546.52529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882546.52543: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27885 1726882546.52554: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882546.52559: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27885 1726882546.52568: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882546.52578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882546.52598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882546.52740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882546.52744: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882546.52747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882546.52749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882546.52751: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882546.52872: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882546.54596: stdout chunk (state=3): >>>/root <<< 27885 1726882546.54738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882546.54813: stderr chunk (state=3): >>><<< 27885 1726882546.54818: stdout chunk (state=3): >>><<< 27885 1726882546.54860: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 
originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882546.54899: _low_level_execute_command(): starting 27885 1726882546.54904: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882546.5486066-28779-117915014634272 `" && echo ansible-tmp-1726882546.5486066-28779-117915014634272="` echo /root/.ansible/tmp/ansible-tmp-1726882546.5486066-28779-117915014634272 `" ) && sleep 0' 27885 1726882546.56006: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882546.56011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882546.56024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882546.56307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882546.56437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882546.58379: stdout chunk (state=3): >>>ansible-tmp-1726882546.5486066-28779-117915014634272=/root/.ansible/tmp/ansible-tmp-1726882546.5486066-28779-117915014634272 <<< 27885 1726882546.58506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882546.58546: stderr chunk (state=3): >>><<< 27885 1726882546.58561: stdout chunk (state=3): >>><<< 27885 1726882546.58585: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882546.5486066-28779-117915014634272=/root/.ansible/tmp/ansible-tmp-1726882546.5486066-28779-117915014634272 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882546.58648: variable 'ansible_module_compression' from source: unknown 27885 1726882546.58698: ANSIBALLZ: Using lock for ping 27885 1726882546.58705: ANSIBALLZ: Acquiring lock 27885 1726882546.58712: ANSIBALLZ: Lock acquired: 140560084983232 27885 1726882546.58718: ANSIBALLZ: Creating module 27885 1726882546.71380: ANSIBALLZ: Writing module into payload 27885 1726882546.71441: ANSIBALLZ: Writing module 27885 1726882546.71458: ANSIBALLZ: Renaming module 27885 1726882546.71465: ANSIBALLZ: Done creating module 27885 1726882546.71482: variable 'ansible_facts' from source: unknown 27885 1726882546.71544: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882546.5486066-28779-117915014634272/AnsiballZ_ping.py 27885 1726882546.71657: Sending initial data 27885 1726882546.71660: Sent initial data (153 bytes) 27885 1726882546.72175: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882546.72179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882546.72181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882546.72184: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882546.72186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882546.72242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882546.72245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882546.72321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882546.74099: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882546.74135: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882546.74216: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpub9svmp6 /root/.ansible/tmp/ansible-tmp-1726882546.5486066-28779-117915014634272/AnsiballZ_ping.py <<< 27885 1726882546.74219: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882546.5486066-28779-117915014634272/AnsiballZ_ping.py" <<< 27885 1726882546.74288: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpub9svmp6" to remote "/root/.ansible/tmp/ansible-tmp-1726882546.5486066-28779-117915014634272/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882546.5486066-28779-117915014634272/AnsiballZ_ping.py" <<< 27885 1726882546.75069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882546.75148: stderr chunk (state=3): >>><<< 27885 1726882546.75151: stdout chunk (state=3): >>><<< 27885 1726882546.75167: done transferring module to remote 27885 1726882546.75178: _low_level_execute_command(): starting 27885 1726882546.75185: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882546.5486066-28779-117915014634272/ /root/.ansible/tmp/ansible-tmp-1726882546.5486066-28779-117915014634272/AnsiballZ_ping.py && sleep 0' 27885 1726882546.75715: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882546.75725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882546.75780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882546.77698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882546.77702: stderr chunk (state=3): >>><<< 27885 1726882546.77704: stdout chunk (state=3): >>><<< 27885 1726882546.77706: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882546.77709: _low_level_execute_command(): starting 27885 1726882546.77711: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882546.5486066-28779-117915014634272/AnsiballZ_ping.py && sleep 0' 27885 1726882546.78261: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882546.78269: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882546.78280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882546.78296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882546.78308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882546.78315: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882546.78324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882546.78338: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27885 1726882546.78354: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882546.78362: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27885 1726882546.78370: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882546.78411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882546.78468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882546.78509: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882546.78585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882546.93557: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 27885 1726882546.94837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882546.94865: stderr chunk (state=3): >>><<< 27885 1726882546.94869: stdout chunk (state=3): >>><<< 27885 1726882546.94887: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 27885 1726882546.94910: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882546.5486066-28779-117915014634272/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882546.94918: _low_level_execute_command(): starting 27885 1726882546.94923: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882546.5486066-28779-117915014634272/ > /dev/null 2>&1 && sleep 0' 27885 1726882546.95465: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882546.95468: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882546.95470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882546.95473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882546.95475: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882546.95477: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882546.95479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882546.95606: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882546.95609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882546.95617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882546.95619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882546.95674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882546.97548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882546.97572: stderr chunk (state=3): >>><<< 27885 1726882546.97575: stdout chunk (state=3): >>><<< 27885 1726882546.97588: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882546.97597: handler run complete 27885 1726882546.97611: attempt loop complete, returning result 27885 1726882546.97613: _execute() done 27885 1726882546.97616: dumping result to json 27885 1726882546.97619: done dumping result, returning 27885 1726882546.97627: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-3fa5-01be-000000000030] 27885 1726882546.97631: sending task result for task 12673a56-9f93-3fa5-01be-000000000030 27885 1726882546.97715: done sending task result for task 12673a56-9f93-3fa5-01be-000000000030 27885 1726882546.97717: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 27885 1726882546.97776: no more pending results, returning what we have 27885 1726882546.97779: results queue empty 27885 1726882546.97780: checking for any_errors_fatal 27885 1726882546.97785: done checking for any_errors_fatal 27885 1726882546.97786: checking for max_fail_percentage 27885 1726882546.97787: done checking for max_fail_percentage 27885 1726882546.97788: checking to see if all hosts have failed and the running result is not ok 27885 1726882546.97791: done checking to see if all hosts have failed 27885 1726882546.97792: getting the remaining hosts for this loop 27885 1726882546.97796: done getting the remaining hosts for this loop 27885 1726882546.97799: getting the next task for host 
managed_node2 27885 1726882546.97807: done getting next task for host managed_node2 27885 1726882546.97809: ^ task is: TASK: meta (role_complete) 27885 1726882546.97812: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882546.97823: getting variables 27885 1726882546.97824: in VariableManager get_vars() 27885 1726882546.97875: Calling all_inventory to load vars for managed_node2 27885 1726882546.97879: Calling groups_inventory to load vars for managed_node2 27885 1726882546.97881: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882546.97895: Calling all_plugins_play to load vars for managed_node2 27885 1726882546.97898: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882546.97905: Calling groups_plugins_play to load vars for managed_node2 27885 1726882546.99544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882547.01272: done with get_vars() 27885 1726882547.01303: done getting variables 27885 1726882547.01400: done queuing things up, now waiting for results queue to drain 27885 1726882547.01402: results queue empty 27885 1726882547.01403: checking for any_errors_fatal 27885 1726882547.01406: done checking for any_errors_fatal 27885 1726882547.01406: checking for max_fail_percentage 27885 1726882547.01407: done checking for max_fail_percentage 27885 1726882547.01408: checking to see if all hosts have failed and the running result is not ok 27885 1726882547.01409: done checking to see if all hosts have failed 27885 1726882547.01409: getting the remaining hosts for this loop 27885 1726882547.01410: done getting the remaining hosts for this loop 27885 1726882547.01413: getting the next task for host managed_node2 27885 1726882547.01417: done getting next task for host managed_node2 27885 1726882547.01419: ^ task is: TASK: Get the IPv4 routes from the route table main 27885 1726882547.01421: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882547.01423: getting variables 27885 1726882547.01424: in VariableManager get_vars() 27885 1726882547.01438: Calling all_inventory to load vars for managed_node2 27885 1726882547.01440: Calling groups_inventory to load vars for managed_node2 27885 1726882547.01442: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882547.01447: Calling all_plugins_play to load vars for managed_node2 27885 1726882547.01449: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882547.01452: Calling groups_plugins_play to load vars for managed_node2 27885 1726882547.02698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882547.04497: done with get_vars() 27885 1726882547.04520: done getting variables 27885 1726882547.04566: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the IPv4 routes from the route table main] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:73 Friday 20 September 2024 21:35:47 -0400 (0:00:00.550) 0:00:19.688 ****** 27885 1726882547.04605: entering _queue_task() for managed_node2/command 27885 1726882547.04988: worker is 1 (out of 1 available) 27885 1726882547.05207: exiting _queue_task() for managed_node2/command 27885 1726882547.05219: done queuing things up, now waiting for results queue to drain 27885 1726882547.05220: waiting for pending results... 27885 1726882547.05375: running TaskExecutor() for managed_node2/TASK: Get the IPv4 routes from the route table main 27885 1726882547.05445: in run() - task 12673a56-9f93-3fa5-01be-000000000060 27885 1726882547.05469: variable 'ansible_search_path' from source: unknown 27885 1726882547.05507: calling self._execute() 27885 1726882547.05619: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882547.05625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882547.05641: variable 'omit' from source: magic vars 27885 1726882547.06067: variable 'ansible_distribution_major_version' from source: facts 27885 1726882547.06084: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882547.06090: variable 'omit' from source: magic vars 27885 1726882547.06126: variable 'omit' from source: magic vars 27885 1726882547.06225: variable 'omit' from source: magic vars 27885 1726882547.06229: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882547.06254: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882547.06274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882547.06302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882547.06318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882547.06354: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882547.06358: variable 'ansible_host' 
from source: host vars for 'managed_node2' 27885 1726882547.06362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882547.06767: Set connection var ansible_pipelining to False 27885 1726882547.06770: Set connection var ansible_connection to ssh 27885 1726882547.06773: Set connection var ansible_timeout to 10 27885 1726882547.06775: Set connection var ansible_shell_type to sh 27885 1726882547.06777: Set connection var ansible_shell_executable to /bin/sh 27885 1726882547.06779: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882547.06781: variable 'ansible_shell_executable' from source: unknown 27885 1726882547.06784: variable 'ansible_connection' from source: unknown 27885 1726882547.06786: variable 'ansible_module_compression' from source: unknown 27885 1726882547.06788: variable 'ansible_shell_type' from source: unknown 27885 1726882547.06790: variable 'ansible_shell_executable' from source: unknown 27885 1726882547.06792: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882547.06798: variable 'ansible_pipelining' from source: unknown 27885 1726882547.06801: variable 'ansible_timeout' from source: unknown 27885 1726882547.06803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882547.06806: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882547.06809: variable 'omit' from source: magic vars 27885 1726882547.06812: starting attempt loop 27885 1726882547.06814: running the handler 27885 1726882547.06817: _low_level_execute_command(): starting 27885 1726882547.06819: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882547.07548: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882547.07811: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882547.07814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882547.07816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882547.09504: stdout chunk (state=3): >>>/root <<< 27885 1726882547.09804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882547.09808: stdout chunk (state=3): >>><<< 27885 1726882547.09810: stderr chunk 
(state=3): >>><<< 27885 1726882547.09814: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882547.09817: _low_level_execute_command(): starting 27885 1726882547.09820: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882547.0967505-28801-411513450729 `" && echo ansible-tmp-1726882547.0967505-28801-411513450729="` echo /root/.ansible/tmp/ansible-tmp-1726882547.0967505-28801-411513450729 `" ) && sleep 0' 27885 1726882547.10407: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882547.10447: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882547.10458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882547.10479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882547.10487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882547.10515: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882547.10564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882547.10779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882547.10816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882547.11020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882547.12872: stdout chunk (state=3): >>>ansible-tmp-1726882547.0967505-28801-411513450729=/root/.ansible/tmp/ansible-tmp-1726882547.0967505-28801-411513450729 <<< 
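For orientation: the remote staging being done here belongs to the task announced above, TASK [Get the IPv4 routes from the route table main] (tests_route_device.yml:73). The playbook source itself is not part of this log, so the following is only a minimal sketch of a task that would produce an equivalent invocation; the register name is hypothetical, and the variable route_table_main_ipv4 consumed later is shown in this log as coming from set_fact rather than directly from register.

  - name: Get the IPv4 routes from the route table main
    ansible.builtin.command: ip -4 route
    register: __route_table_v4    # hypothetical name; not shown in this log
    changed_when: false           # assumed, since the reported task result shows "changed": false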
27885 1726882547.12980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882547.13016: stderr chunk (state=3): >>><<< 27885 1726882547.13018: stdout chunk (state=3): >>><<< 27885 1726882547.13029: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882547.0967505-28801-411513450729=/root/.ansible/tmp/ansible-tmp-1726882547.0967505-28801-411513450729 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882547.13102: variable 'ansible_module_compression' from source: unknown 27885 1726882547.13105: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882547.13130: variable 'ansible_facts' from source: unknown 27885 1726882547.13182: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882547.0967505-28801-411513450729/AnsiballZ_command.py 27885 1726882547.13281: Sending initial data 27885 1726882547.13286: Sent initial data (153 bytes) 27885 1726882547.13679: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882547.13713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882547.13716: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882547.13719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882547.13721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882547.13723: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882547.13778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882547.13780: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882547.13834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882547.15662: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882547.15702: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882547.0967505-28801-411513450729/AnsiballZ_command.py" <<< 27885 1726882547.15710: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpoo1q8nyb /root/.ansible/tmp/ansible-tmp-1726882547.0967505-28801-411513450729/AnsiballZ_command.py <<< 27885 1726882547.15923: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpoo1q8nyb" to remote "/root/.ansible/tmp/ansible-tmp-1726882547.0967505-28801-411513450729/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882547.0967505-28801-411513450729/AnsiballZ_command.py" <<< 27885 1726882547.17018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882547.17140: stderr chunk (state=3): >>><<< 27885 1726882547.17143: stdout chunk (state=3): >>><<< 27885 1726882547.17145: done transferring module to remote 27885 1726882547.17148: _low_level_execute_command(): starting 27885 1726882547.17150: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882547.0967505-28801-411513450729/ /root/.ansible/tmp/ansible-tmp-1726882547.0967505-28801-411513450729/AnsiballZ_command.py && sleep 0' 27885 1726882547.17936: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882547.18250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882547.18385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882547.20062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882547.20107: stderr chunk (state=3): >>><<< 27885 1726882547.20117: stdout chunk (state=3): >>><<< 27885 1726882547.20138: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882547.20150: _low_level_execute_command(): starting 27885 1726882547.20159: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882547.0967505-28801-411513450729/AnsiballZ_command.py && sleep 0' 27885 1726882547.20697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882547.20712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882547.20725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882547.20742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882547.20759: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882547.20851: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882547.20869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882547.20885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882547.21010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882547.36723: stdout 
chunk (state=3): >>> {"changed": true, "stdout": "default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.69 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.69 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \n198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 \n198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 \n198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 ", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "route"], "start": "2024-09-20 21:35:47.361338", "end": "2024-09-20 21:35:47.365845", "delta": "0:00:00.004507", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27885 1726882547.38255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882547.38286: stderr chunk (state=3): >>><<< 27885 1726882547.38295: stdout chunk (state=3): >>><<< 27885 1726882547.38315: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.69 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.69 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \n198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 \n198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 \n198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 ", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "route"], "start": "2024-09-20 21:35:47.361338", "end": "2024-09-20 21:35:47.365845", "delta": "0:00:00.004507", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 
10.31.14.69 closed. 27885 1726882547.38375: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882547.0967505-28801-411513450729/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882547.38379: _low_level_execute_command(): starting 27885 1726882547.38381: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882547.0967505-28801-411513450729/ > /dev/null 2>&1 && sleep 0' 27885 1726882547.38992: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882547.38997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882547.39011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882547.39128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882547.39132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882547.39226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882547.41134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882547.41137: stdout chunk (state=3): >>><<< 27885 1726882547.41145: stderr chunk (state=3): >>><<< 27885 1726882547.41299: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882547.41303: handler run complete 27885 1726882547.41305: Evaluated conditional (False): False 27885 1726882547.41307: attempt loop complete, returning result 27885 1726882547.41309: _execute() done 27885 1726882547.41311: dumping result to json 27885 1726882547.41313: done dumping result, returning 27885 1726882547.41315: done running TaskExecutor() for managed_node2/TASK: Get the IPv4 routes from the route table main [12673a56-9f93-3fa5-01be-000000000060] 27885 1726882547.41317: sending task result for task 12673a56-9f93-3fa5-01be-000000000060 27885 1726882547.41388: done sending task result for task 12673a56-9f93-3fa5-01be-000000000060 27885 1726882547.41396: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "-4", "route" ], "delta": "0:00:00.004507", "end": "2024-09-20 21:35:47.365845", "rc": 0, "start": "2024-09-20 21:35:47.361338" } STDOUT: default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.69 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.69 metric 100 192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown 198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 27885 1726882547.41477: no more pending results, returning what we have 27885 1726882547.41480: results queue empty 27885 1726882547.41481: checking for any_errors_fatal 27885 1726882547.41483: done checking for any_errors_fatal 27885 1726882547.41483: checking for max_fail_percentage 27885 1726882547.41485: done checking for max_fail_percentage 27885 1726882547.41486: checking to see if all hosts have failed and the running result is not ok 27885 1726882547.41487: done checking to see if all hosts have failed 27885 1726882547.41487: getting the remaining hosts for this loop 27885 1726882547.41492: done getting the remaining hosts for this loop 27885 1726882547.41497: getting the next task for host managed_node2 27885 1726882547.41504: done getting next task for host managed_node2 27885 1726882547.41507: ^ task is: TASK: Assert that the route table main contains the specified IPv4 routes 27885 1726882547.41510: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882547.41514: getting variables 27885 1726882547.41516: in VariableManager get_vars() 27885 1726882547.41557: Calling all_inventory to load vars for managed_node2 27885 1726882547.41560: Calling groups_inventory to load vars for managed_node2 27885 1726882547.41562: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882547.41575: Calling all_plugins_play to load vars for managed_node2 27885 1726882547.41577: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882547.41580: Calling groups_plugins_play to load vars for managed_node2 27885 1726882547.52043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882547.54453: done with get_vars() 27885 1726882547.54481: done getting variables 27885 1726882547.54644: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the route table main contains the specified IPv4 routes] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:78 Friday 20 September 2024 21:35:47 -0400 (0:00:00.501) 0:00:20.189 ****** 27885 1726882547.54732: entering _queue_task() for managed_node2/assert 27885 1726882547.55687: worker is 1 (out of 1 available) 27885 1726882547.55814: exiting _queue_task() for managed_node2/assert 27885 1726882547.55826: done queuing things up, now waiting for results queue to drain 27885 1726882547.55828: waiting for pending results... 27885 1726882547.56417: running TaskExecutor() for managed_node2/TASK: Assert that the route table main contains the specified IPv4 routes 27885 1726882547.56423: in run() - task 12673a56-9f93-3fa5-01be-000000000061 27885 1726882547.56466: variable 'ansible_search_path' from source: unknown 27885 1726882547.56497: calling self._execute() 27885 1726882547.56635: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882547.56639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882547.56648: variable 'omit' from source: magic vars 27885 1726882547.57184: variable 'ansible_distribution_major_version' from source: facts 27885 1726882547.57188: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882547.57196: variable 'omit' from source: magic vars 27885 1726882547.57210: variable 'omit' from source: magic vars 27885 1726882547.57298: variable 'omit' from source: magic vars 27885 1726882547.57363: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882547.57453: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882547.57469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882547.57521: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882547.57525: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882547.57569: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882547.57630: variable 
'ansible_host' from source: host vars for 'managed_node2' 27885 1726882547.57634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882547.57727: Set connection var ansible_pipelining to False 27885 1726882547.57755: Set connection var ansible_connection to ssh 27885 1726882547.57768: Set connection var ansible_timeout to 10 27885 1726882547.57787: Set connection var ansible_shell_type to sh 27885 1726882547.57810: Set connection var ansible_shell_executable to /bin/sh 27885 1726882547.57846: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882547.57872: variable 'ansible_shell_executable' from source: unknown 27885 1726882547.57897: variable 'ansible_connection' from source: unknown 27885 1726882547.57900: variable 'ansible_module_compression' from source: unknown 27885 1726882547.57960: variable 'ansible_shell_type' from source: unknown 27885 1726882547.57963: variable 'ansible_shell_executable' from source: unknown 27885 1726882547.57966: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882547.57968: variable 'ansible_pipelining' from source: unknown 27885 1726882547.57970: variable 'ansible_timeout' from source: unknown 27885 1726882547.57973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882547.58207: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882547.58241: variable 'omit' from source: magic vars 27885 1726882547.58295: starting attempt loop 27885 1726882547.58299: running the handler 27885 1726882547.58481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882547.58827: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882547.58913: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882547.58969: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882547.59017: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882547.59134: variable 'route_table_main_ipv4' from source: set_fact 27885 1726882547.59263: Evaluated conditional (route_table_main_ipv4.stdout is search("198.51.10.64/26 via 198.51.100.6 dev ethtest0\s+(proto static )?metric 4")): True 27885 1726882547.59429: variable 'route_table_main_ipv4' from source: set_fact 27885 1726882547.59495: Evaluated conditional (route_table_main_ipv4.stdout is search("198.51.12.128/26 via 198.51.100.1 dev ethtest1\s+(proto static )?metric 2")): True 27885 1726882547.59505: handler run complete 27885 1726882547.59521: attempt loop complete, returning result 27885 1726882547.59527: _execute() done 27885 1726882547.59588: dumping result to json 27885 1726882547.59596: done dumping result, returning 27885 1726882547.59599: done running TaskExecutor() for managed_node2/TASK: Assert that the route table main contains the specified IPv4 routes [12673a56-9f93-3fa5-01be-000000000061] 27885 1726882547.59602: sending task result for task 12673a56-9f93-3fa5-01be-000000000061 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 27885 
1726882547.59742: no more pending results, returning what we have 27885 1726882547.59746: results queue empty 27885 1726882547.59747: checking for any_errors_fatal 27885 1726882547.59760: done checking for any_errors_fatal 27885 1726882547.59761: checking for max_fail_percentage 27885 1726882547.59762: done checking for max_fail_percentage 27885 1726882547.59763: checking to see if all hosts have failed and the running result is not ok 27885 1726882547.59764: done checking to see if all hosts have failed 27885 1726882547.59765: getting the remaining hosts for this loop 27885 1726882547.59766: done getting the remaining hosts for this loop 27885 1726882547.59770: getting the next task for host managed_node2 27885 1726882547.59776: done getting next task for host managed_node2 27885 1726882547.59778: ^ task is: TASK: Get the IPv6 routes from the route table main 27885 1726882547.59780: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882547.59784: getting variables 27885 1726882547.59786: in VariableManager get_vars() 27885 1726882547.59836: Calling all_inventory to load vars for managed_node2 27885 1726882547.59840: Calling groups_inventory to load vars for managed_node2 27885 1726882547.59843: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882547.59854: Calling all_plugins_play to load vars for managed_node2 27885 1726882547.59857: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882547.59861: Calling groups_plugins_play to load vars for managed_node2 27885 1726882547.60535: done sending task result for task 12673a56-9f93-3fa5-01be-000000000061 27885 1726882547.60545: WORKER PROCESS EXITING 27885 1726882547.62894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882547.65080: done with get_vars() 27885 1726882547.65110: done getting variables 27885 1726882547.65178: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the IPv6 routes from the route table main] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:89 Friday 20 September 2024 21:35:47 -0400 (0:00:00.104) 0:00:20.294 ****** 27885 1726882547.65216: entering _queue_task() for managed_node2/command 27885 1726882547.65646: worker is 1 (out of 1 available) 27885 1726882547.65659: exiting _queue_task() for managed_node2/command 27885 1726882547.65672: done queuing things up, now waiting for results queue to drain 27885 1726882547.65673: waiting for pending results... 
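The IPv4 assert that just completed evaluated two Jinja search tests against route_table_main_ipv4.stdout, both reported as True. A minimal sketch of an equivalent assert task, reusing the exact patterns from the log (the real task at tests_route_device.yml:78 may differ in layout or carry additional options):

  - name: Assert that the route table main contains the specified IPv4 routes
    ansible.builtin.assert:
      that:
        - route_table_main_ipv4.stdout is search("198.51.10.64/26 via 198.51.100.6 dev ethtest0\s+(proto static )?metric 4")
        - route_table_main_ipv4.stdout is search("198.51.12.128/26 via 198.51.100.1 dev ethtest1\s+(proto static )?metric 2")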
27885 1726882547.66032: running TaskExecutor() for managed_node2/TASK: Get the IPv6 routes from the route table main 27885 1726882547.66221: in run() - task 12673a56-9f93-3fa5-01be-000000000062 27885 1726882547.66262: variable 'ansible_search_path' from source: unknown 27885 1726882547.66346: calling self._execute() 27885 1726882547.66478: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882547.66482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882547.66485: variable 'omit' from source: magic vars 27885 1726882547.67463: variable 'ansible_distribution_major_version' from source: facts 27885 1726882547.67467: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882547.67472: variable 'omit' from source: magic vars 27885 1726882547.67699: variable 'omit' from source: magic vars 27885 1726882547.67855: variable 'omit' from source: magic vars 27885 1726882547.67911: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882547.67961: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882547.67984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882547.68018: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882547.68054: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882547.68101: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882547.68111: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882547.68119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882547.68237: Set connection var ansible_pipelining to False 27885 1726882547.68255: Set connection var ansible_connection to ssh 27885 1726882547.68267: Set connection var ansible_timeout to 10 27885 1726882547.68322: Set connection var ansible_shell_type to sh 27885 1726882547.68324: Set connection var ansible_shell_executable to /bin/sh 27885 1726882547.68326: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882547.68328: variable 'ansible_shell_executable' from source: unknown 27885 1726882547.68330: variable 'ansible_connection' from source: unknown 27885 1726882547.68331: variable 'ansible_module_compression' from source: unknown 27885 1726882547.68333: variable 'ansible_shell_type' from source: unknown 27885 1726882547.68337: variable 'ansible_shell_executable' from source: unknown 27885 1726882547.68343: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882547.68349: variable 'ansible_pipelining' from source: unknown 27885 1726882547.68362: variable 'ansible_timeout' from source: unknown 27885 1726882547.68368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882547.68522: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882547.68540: variable 'omit' from source: magic vars 27885 1726882547.68577: starting attempt loop 27885 
1726882547.68580: running the handler 27885 1726882547.68585: _low_level_execute_command(): starting 27885 1726882547.68604: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882547.69458: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882547.69552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882547.69585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882547.69710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882547.71747: stdout chunk (state=3): >>>/root <<< 27885 1726882547.72005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882547.72009: stdout chunk (state=3): >>><<< 27885 1726882547.72012: stderr chunk (state=3): >>><<< 27885 1726882547.72018: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882547.72021: _low_level_execute_command(): starting 27885 1726882547.72024: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882547.7196302-28841-275245425351949 `" && echo ansible-tmp-1726882547.7196302-28841-275245425351949="` echo /root/.ansible/tmp/ansible-tmp-1726882547.7196302-28841-275245425351949 `" ) && sleep 0' 27885 1726882547.72809: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882547.72865: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882547.72958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882547.73011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882547.73052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882547.73146: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882547.74978: stdout chunk (state=3): >>>ansible-tmp-1726882547.7196302-28841-275245425351949=/root/.ansible/tmp/ansible-tmp-1726882547.7196302-28841-275245425351949 <<< 27885 1726882547.75137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882547.75149: stderr chunk (state=3): >>><<< 27885 1726882547.75162: stdout chunk (state=3): >>><<< 27885 1726882547.75202: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882547.7196302-28841-275245425351949=/root/.ansible/tmp/ansible-tmp-1726882547.7196302-28841-275245425351949 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882547.75236: variable 'ansible_module_compression' from source: unknown 27885 1726882547.75299: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882547.75398: variable 'ansible_facts' from source: unknown 27885 1726882547.75457: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882547.7196302-28841-275245425351949/AnsiballZ_command.py 27885 1726882547.75655: Sending initial data 27885 1726882547.75658: Sent initial data (156 bytes) 27885 1726882547.76432: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882547.76498: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882547.76715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882547.76804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882547.78344: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882547.78408: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27885 1726882547.78628: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp_3m8uv4s /root/.ansible/tmp/ansible-tmp-1726882547.7196302-28841-275245425351949/AnsiballZ_command.py <<< 27885 1726882547.78632: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882547.7196302-28841-275245425351949/AnsiballZ_command.py" <<< 27885 1726882547.78715: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp_3m8uv4s" to remote "/root/.ansible/tmp/ansible-tmp-1726882547.7196302-28841-275245425351949/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882547.7196302-28841-275245425351949/AnsiballZ_command.py" <<< 27885 1726882547.79761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882547.79799: stderr chunk (state=3): >>><<< 27885 1726882547.79824: stdout chunk (state=3): >>><<< 27885 1726882547.79855: done transferring module to remote 27885 1726882547.79884: _low_level_execute_command(): starting 27885 1726882547.79899: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882547.7196302-28841-275245425351949/ /root/.ansible/tmp/ansible-tmp-1726882547.7196302-28841-275245425351949/AnsiballZ_command.py && sleep 0' 27885 1726882547.80688: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882547.80700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882547.80711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882547.80731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882547.80749: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882547.80759: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882547.80908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882547.80912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882547.80914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882547.80961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882547.81050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882547.82821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882547.82929: stderr chunk (state=3): >>><<< 27885 1726882547.82932: stdout chunk (state=3): >>><<< 27885 1726882547.82935: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882547.82938: _low_level_execute_command(): starting 27885 1726882547.82940: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882547.7196302-28841-275245425351949/AnsiballZ_command.py && sleep 0' 27885 1726882547.83963: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882547.83966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882547.84117: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882547.84121: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882547.84125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882547.84172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882547.84363: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882547.84463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882547.99630: stdout chunk (state=3): >>> {"changed": true, "stdout": "2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium\n2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium\n2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium\nfe80::/64 dev peerethtest0 proto kernel metric 256 pref medium\nfe80::/64 dev peerethtest1 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest1 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 
21:35:47.992049", "end": "2024-09-20 21:35:47.995450", "delta": "0:00:00.003401", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27885 1726882548.01265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882548.01269: stdout chunk (state=3): >>><<< 27885 1726882548.01272: stderr chunk (state=3): >>><<< 27885 1726882548.01276: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium\n2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium\n2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium\nfe80::/64 dev peerethtest0 proto kernel metric 256 pref medium\nfe80::/64 dev peerethtest1 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest1 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 21:35:47.992049", "end": "2024-09-20 21:35:47.995450", "delta": "0:00:00.003401", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
27885 1726882548.01281: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882547.7196302-28841-275245425351949/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882548.01284: _low_level_execute_command(): starting 27885 1726882548.01287: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882547.7196302-28841-275245425351949/ > /dev/null 2>&1 && sleep 0' 27885 1726882548.02084: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882548.02107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882548.02110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882548.02124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882548.02137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882548.02145: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882548.02155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882548.02178: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27885 1726882548.02186: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882548.02196: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27885 1726882548.02225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882548.02296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882548.02300: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882548.02329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882548.02419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882548.04221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882548.04238: stderr chunk (state=3): >>><<< 27885 1726882548.04241: stdout chunk (state=3): >>><<< 27885 1726882548.04253: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882548.04259: handler run complete 27885 1726882548.04275: Evaluated conditional (False): False 27885 1726882548.04283: attempt loop complete, returning result 27885 1726882548.04286: _execute() done 27885 1726882548.04288: dumping result to json 27885 1726882548.04295: done dumping result, returning 27885 1726882548.04305: done running TaskExecutor() for managed_node2/TASK: Get the IPv6 routes from the route table main [12673a56-9f93-3fa5-01be-000000000062] 27885 1726882548.04311: sending task result for task 12673a56-9f93-3fa5-01be-000000000062 27885 1726882548.04403: done sending task result for task 12673a56-9f93-3fa5-01be-000000000062 27885 1726882548.04407: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.003401", "end": "2024-09-20 21:35:47.995450", "rc": 0, "start": "2024-09-20 21:35:47.992049" } STDOUT: 2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium 2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium 2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium fe80::/64 dev peerethtest0 proto kernel metric 256 pref medium fe80::/64 dev peerethtest1 proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 1024 pref medium fe80::/64 dev ethtest0 proto kernel metric 1024 pref medium fe80::/64 dev ethtest1 proto kernel metric 1024 pref medium 27885 1726882548.04504: no more pending results, returning what we have 27885 1726882548.04509: results queue empty 27885 1726882548.04510: checking for any_errors_fatal 27885 1726882548.04517: done checking for any_errors_fatal 27885 1726882548.04518: checking for max_fail_percentage 27885 1726882548.04519: done checking for max_fail_percentage 27885 1726882548.04520: checking to see if all hosts have failed and the running result is not ok 27885 1726882548.04521: done checking to see if all hosts have failed 27885 1726882548.04522: getting the remaining hosts for this loop 27885 1726882548.04523: done getting the remaining hosts for this loop 27885 1726882548.04526: getting the next task for host managed_node2 27885 1726882548.04532: done getting next task for host managed_node2 27885 1726882548.04534: ^ task is: TASK: Assert that the route table main contains the specified IPv6 routes 27885 1726882548.04536: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882548.04539: getting variables 27885 1726882548.04541: in VariableManager get_vars() 27885 1726882548.04577: Calling all_inventory to load vars for managed_node2 27885 1726882548.04579: Calling groups_inventory to load vars for managed_node2 27885 1726882548.04581: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882548.04599: Calling all_plugins_play to load vars for managed_node2 27885 1726882548.04603: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882548.04606: Calling groups_plugins_play to load vars for managed_node2 27885 1726882548.05466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882548.07327: done with get_vars() 27885 1726882548.07349: done getting variables 27885 1726882548.07396: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the route table main contains the specified IPv6 routes] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:94 Friday 20 September 2024 21:35:48 -0400 (0:00:00.422) 0:00:20.716 ****** 27885 1726882548.07417: entering _queue_task() for managed_node2/assert 27885 1726882548.07698: worker is 1 (out of 1 available) 27885 1726882548.07712: exiting _queue_task() for managed_node2/assert 27885 1726882548.07723: done queuing things up, now waiting for results queue to drain 27885 1726882548.07724: waiting for pending results... 27885 1726882548.07898: running TaskExecutor() for managed_node2/TASK: Assert that the route table main contains the specified IPv6 routes 27885 1726882548.07958: in run() - task 12673a56-9f93-3fa5-01be-000000000063 27885 1726882548.07973: variable 'ansible_search_path' from source: unknown 27885 1726882548.08006: calling self._execute() 27885 1726882548.08084: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882548.08098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882548.08102: variable 'omit' from source: magic vars 27885 1726882548.08380: variable 'ansible_distribution_major_version' from source: facts 27885 1726882548.08392: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882548.08397: variable 'omit' from source: magic vars 27885 1726882548.08419: variable 'omit' from source: magic vars 27885 1726882548.08446: variable 'omit' from source: magic vars 27885 1726882548.08476: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882548.08507: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882548.08525: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882548.08541: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882548.08550: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882548.08573: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882548.08576: variable 
'ansible_host' from source: host vars for 'managed_node2' 27885 1726882548.08579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882548.08653: Set connection var ansible_pipelining to False 27885 1726882548.08657: Set connection var ansible_connection to ssh 27885 1726882548.08662: Set connection var ansible_timeout to 10 27885 1726882548.08665: Set connection var ansible_shell_type to sh 27885 1726882548.08669: Set connection var ansible_shell_executable to /bin/sh 27885 1726882548.08675: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882548.08696: variable 'ansible_shell_executable' from source: unknown 27885 1726882548.08699: variable 'ansible_connection' from source: unknown 27885 1726882548.08702: variable 'ansible_module_compression' from source: unknown 27885 1726882548.08705: variable 'ansible_shell_type' from source: unknown 27885 1726882548.08707: variable 'ansible_shell_executable' from source: unknown 27885 1726882548.08711: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882548.08713: variable 'ansible_pipelining' from source: unknown 27885 1726882548.08715: variable 'ansible_timeout' from source: unknown 27885 1726882548.08718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882548.08817: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882548.08825: variable 'omit' from source: magic vars 27885 1726882548.08830: starting attempt loop 27885 1726882548.08832: running the handler 27885 1726882548.08948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882548.09135: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882548.09167: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882548.09306: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882548.09309: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882548.09498: variable 'route_table_main_ipv6' from source: set_fact 27885 1726882548.09501: Evaluated conditional (route_table_main_ipv6.stdout is search("2001:db6::4 via 2001:db8::1 dev ethtest0\s+(proto static )?metric 2")): True 27885 1726882548.09503: handler run complete 27885 1726882548.09505: attempt loop complete, returning result 27885 1726882548.09507: _execute() done 27885 1726882548.09508: dumping result to json 27885 1726882548.09510: done dumping result, returning 27885 1726882548.09511: done running TaskExecutor() for managed_node2/TASK: Assert that the route table main contains the specified IPv6 routes [12673a56-9f93-3fa5-01be-000000000063] 27885 1726882548.09513: sending task result for task 12673a56-9f93-3fa5-01be-000000000063 27885 1726882548.09569: done sending task result for task 12673a56-9f93-3fa5-01be-000000000063 27885 1726882548.09572: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 27885 1726882548.09631: no more pending results, returning what we have 27885 1726882548.09634: results queue empty 27885 
1726882548.09635: checking for any_errors_fatal 27885 1726882548.09642: done checking for any_errors_fatal 27885 1726882548.09642: checking for max_fail_percentage 27885 1726882548.09643: done checking for max_fail_percentage 27885 1726882548.09644: checking to see if all hosts have failed and the running result is not ok 27885 1726882548.09645: done checking to see if all hosts have failed 27885 1726882548.09646: getting the remaining hosts for this loop 27885 1726882548.09647: done getting the remaining hosts for this loop 27885 1726882548.09650: getting the next task for host managed_node2 27885 1726882548.09655: done getting next task for host managed_node2 27885 1726882548.09657: ^ task is: TASK: Get the interface1 MAC address 27885 1726882548.09659: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882548.09662: getting variables 27885 1726882548.09663: in VariableManager get_vars() 27885 1726882548.09772: Calling all_inventory to load vars for managed_node2 27885 1726882548.09776: Calling groups_inventory to load vars for managed_node2 27885 1726882548.09778: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882548.09787: Calling all_plugins_play to load vars for managed_node2 27885 1726882548.09790: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882548.09795: Calling groups_plugins_play to load vars for managed_node2 27885 1726882548.11196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882548.12726: done with get_vars() 27885 1726882548.12747: done getting variables 27885 1726882548.12809: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the interface1 MAC address] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:99 Friday 20 September 2024 21:35:48 -0400 (0:00:00.054) 0:00:20.770 ****** 27885 1726882548.12837: entering _queue_task() for managed_node2/command 27885 1726882548.13163: worker is 1 (out of 1 available) 27885 1726882548.13174: exiting _queue_task() for managed_node2/command 27885 1726882548.13186: done queuing things up, now waiting for results queue to drain 27885 1726882548.13187: waiting for pending results... 
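The assertion evaluated just above checks the registered route listing against a regular expression; the condition quoted in the log is route_table_main_ipv6.stdout is search("2001:db6::4 via 2001:db8::1 dev ethtest0\s+(proto static )?metric 2"). A sketch of the corresponding assert task follows; only that condition is taken from this log, the surrounding layout and any additional conditions are assumptions, and the real task sits at tests_route_device.yml:94.

- name: Assert that the route table main contains the specified IPv6 routes
  assert:
    that:
      - route_table_main_ipv6.stdout is search("2001:db6::4 via 2001:db8::1 dev ethtest0\s+(proto static )?metric 2")
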
27885 1726882548.13710: running TaskExecutor() for managed_node2/TASK: Get the interface1 MAC address 27885 1726882548.13715: in run() - task 12673a56-9f93-3fa5-01be-000000000064 27885 1726882548.13718: variable 'ansible_search_path' from source: unknown 27885 1726882548.13721: calling self._execute() 27885 1726882548.13724: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882548.13727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882548.13731: variable 'omit' from source: magic vars 27885 1726882548.14114: variable 'ansible_distribution_major_version' from source: facts 27885 1726882548.14125: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882548.14131: variable 'omit' from source: magic vars 27885 1726882548.14154: variable 'omit' from source: magic vars 27885 1726882548.14288: variable 'interface1' from source: play vars 27885 1726882548.14296: variable 'omit' from source: magic vars 27885 1726882548.14306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882548.14343: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882548.14363: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882548.14400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882548.14403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882548.14507: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882548.14511: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882548.14515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882548.14540: Set connection var ansible_pipelining to False 27885 1726882548.14544: Set connection var ansible_connection to ssh 27885 1726882548.14551: Set connection var ansible_timeout to 10 27885 1726882548.14553: Set connection var ansible_shell_type to sh 27885 1726882548.14559: Set connection var ansible_shell_executable to /bin/sh 27885 1726882548.14564: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882548.14588: variable 'ansible_shell_executable' from source: unknown 27885 1726882548.14595: variable 'ansible_connection' from source: unknown 27885 1726882548.14598: variable 'ansible_module_compression' from source: unknown 27885 1726882548.14601: variable 'ansible_shell_type' from source: unknown 27885 1726882548.14614: variable 'ansible_shell_executable' from source: unknown 27885 1726882548.14617: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882548.14619: variable 'ansible_pipelining' from source: unknown 27885 1726882548.14622: variable 'ansible_timeout' from source: unknown 27885 1726882548.14624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882548.14799: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882548.14803: variable 'omit' from source: magic vars 27885 
1726882548.14805: starting attempt loop 27885 1726882548.14808: running the handler 27885 1726882548.14810: _low_level_execute_command(): starting 27885 1726882548.14812: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882548.15604: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882548.15632: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882548.15729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882548.17388: stdout chunk (state=3): >>>/root <<< 27885 1726882548.17495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882548.17700: stderr chunk (state=3): >>><<< 27885 1726882548.17704: stdout chunk (state=3): >>><<< 27885 1726882548.17708: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882548.17711: _low_level_execute_command(): starting 27885 1726882548.17715: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882548.1755898-28870-215522926763551 `" && echo ansible-tmp-1726882548.1755898-28870-215522926763551="` echo /root/.ansible/tmp/ansible-tmp-1726882548.1755898-28870-215522926763551 `" ) && sleep 0' 27885 1726882548.18169: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882548.18177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882548.18192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882548.18206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882548.18220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882548.18228: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882548.18274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882548.18289: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27885 1726882548.18296: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882548.18298: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27885 1726882548.18301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882548.18303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882548.18305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882548.18384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882548.18388: stderr chunk (state=3): >>>debug2: match found <<< 27885 1726882548.18394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882548.18397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882548.18408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882548.18419: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882548.18520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882548.20749: stdout chunk (state=3): >>>ansible-tmp-1726882548.1755898-28870-215522926763551=/root/.ansible/tmp/ansible-tmp-1726882548.1755898-28870-215522926763551 <<< 27885 1726882548.20895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882548.20929: stdout chunk (state=3): >>><<< 27885 1726882548.20933: stderr chunk (state=3): >>><<< 27885 1726882548.20952: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882548.1755898-28870-215522926763551=/root/.ansible/tmp/ansible-tmp-1726882548.1755898-28870-215522926763551 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 
originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882548.20987: variable 'ansible_module_compression' from source: unknown 27885 1726882548.21098: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882548.21108: variable 'ansible_facts' from source: unknown 27885 1726882548.21199: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882548.1755898-28870-215522926763551/AnsiballZ_command.py 27885 1726882548.21553: Sending initial data 27885 1726882548.21556: Sent initial data (156 bytes) 27885 1726882548.22022: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882548.22039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882548.22100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882548.22105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882548.22186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882548.23712: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27885 1726882548.23720: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882548.23770: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27885 1726882548.23830: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpsu_qgbvz /root/.ansible/tmp/ansible-tmp-1726882548.1755898-28870-215522926763551/AnsiballZ_command.py <<< 27885 1726882548.23835: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882548.1755898-28870-215522926763551/AnsiballZ_command.py" <<< 27885 1726882548.23888: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpsu_qgbvz" to remote "/root/.ansible/tmp/ansible-tmp-1726882548.1755898-28870-215522926763551/AnsiballZ_command.py" <<< 27885 1726882548.23895: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882548.1755898-28870-215522926763551/AnsiballZ_command.py" <<< 27885 1726882548.24498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882548.24551: stderr chunk (state=3): >>><<< 27885 1726882548.24557: stdout chunk (state=3): >>><<< 27885 1726882548.24582: done transferring module to remote 27885 1726882548.24605: _low_level_execute_command(): starting 27885 1726882548.24609: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882548.1755898-28870-215522926763551/ /root/.ansible/tmp/ansible-tmp-1726882548.1755898-28870-215522926763551/AnsiballZ_command.py && sleep 0' 27885 1726882548.25232: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882548.25241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882548.25268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882548.25275: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882548.25277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882548.25365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882548.25367: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882548.25369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882548.25421: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882548.27132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882548.27154: stderr chunk (state=3): >>><<< 27885 1726882548.27157: stdout chunk (state=3): >>><<< 27885 1726882548.27171: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882548.27174: _low_level_execute_command(): starting 27885 1726882548.27179: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882548.1755898-28870-215522926763551/AnsiballZ_command.py && sleep 0' 27885 1726882548.27592: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882548.27627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882548.27632: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882548.27635: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882548.27638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882548.27640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882548.27683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882548.27687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882548.27761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882548.42847: stdout chunk (state=3): >>> {"changed": true, "stdout": "ca:0d:d0:ac:e2:a3", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/ethtest1/address"], "start": "2024-09-20 21:35:48.424748", "end": "2024-09-20 21:35:48.427691", "delta": "0:00:00.002943", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/ethtest1/address", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27885 1726882548.44206: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882548.44254: stderr chunk (state=3): >>><<< 27885 1726882548.44258: stdout chunk (state=3): >>><<< 27885 1726882548.44282: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ca:0d:d0:ac:e2:a3", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/ethtest1/address"], "start": "2024-09-20 21:35:48.424748", "end": "2024-09-20 21:35:48.427691", "delta": "0:00:00.002943", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/ethtest1/address", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
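The result above comes from reading the sysfs address attribute of the second test interface: the raw command is cat /sys/class/net/ethtest1/address, and the log earlier resolves the variable interface1 from play vars, so the playbook most likely templates the interface name into the path. A sketch under those assumptions follows; the register name and changed_when guard are not visible in this log, and the real task is at tests_route_device.yml:99.

- name: Get the interface1 MAC address
  command: cat /sys/class/net/{{ interface1 }}/address
  register: interface1_mac   # illustrative; the actual register name is not shown in this excerpt
  changed_when: false        # inferred from the final ok result reporting "changed": false
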
27885 1726882548.44323: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/ethtest1/address', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882548.1755898-28870-215522926763551/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882548.44336: _low_level_execute_command(): starting 27885 1726882548.44343: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882548.1755898-28870-215522926763551/ > /dev/null 2>&1 && sleep 0' 27885 1726882548.44997: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882548.45000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882548.45055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882548.45063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882548.45067: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882548.45134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882548.47102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882548.47106: stdout chunk (state=3): >>><<< 27885 1726882548.47109: stderr chunk (state=3): >>><<< 27885 1726882548.47111: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882548.47114: handler run complete 27885 1726882548.47116: Evaluated conditional (False): False 27885 1726882548.47118: attempt loop complete, returning result 27885 1726882548.47119: _execute() done 27885 1726882548.47121: dumping result to json 27885 1726882548.47123: done dumping result, returning 27885 1726882548.47125: done running TaskExecutor() for managed_node2/TASK: Get the interface1 MAC address [12673a56-9f93-3fa5-01be-000000000064] 27885 1726882548.47127: sending task result for task 12673a56-9f93-3fa5-01be-000000000064 ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/sys/class/net/ethtest1/address" ], "delta": "0:00:00.002943", "end": "2024-09-20 21:35:48.427691", "rc": 0, "start": "2024-09-20 21:35:48.424748" } STDOUT: ca:0d:d0:ac:e2:a3 27885 1726882548.47291: no more pending results, returning what we have 27885 1726882548.47297: results queue empty 27885 1726882548.47298: checking for any_errors_fatal 27885 1726882548.47309: done checking for any_errors_fatal 27885 1726882548.47310: checking for max_fail_percentage 27885 1726882548.47312: done checking for max_fail_percentage 27885 1726882548.47313: checking to see if all hosts have failed and the running result is not ok 27885 1726882548.47313: done checking to see if all hosts have failed 27885 1726882548.47314: getting the remaining hosts for this loop 27885 1726882548.47316: done getting the remaining hosts for this loop 27885 1726882548.47319: getting the next task for host managed_node2 27885 1726882548.47326: done getting next task for host managed_node2 27885 1726882548.47331: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 27885 1726882548.47334: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882548.47360: getting variables 27885 1726882548.47362: in VariableManager get_vars() 27885 1726882548.47428: Calling all_inventory to load vars for managed_node2 27885 1726882548.47431: Calling groups_inventory to load vars for managed_node2 27885 1726882548.47434: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882548.47446: done sending task result for task 12673a56-9f93-3fa5-01be-000000000064 27885 1726882548.47449: WORKER PROCESS EXITING 27885 1726882548.47463: Calling all_plugins_play to load vars for managed_node2 27885 1726882548.47469: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882548.47473: Calling groups_plugins_play to load vars for managed_node2 27885 1726882548.48373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882548.49280: done with get_vars() 27885 1726882548.49298: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:35:48 -0400 (0:00:00.365) 0:00:21.136 ****** 27885 1726882548.49365: entering _queue_task() for managed_node2/include_tasks 27885 1726882548.49591: worker is 1 (out of 1 available) 27885 1726882548.49606: exiting _queue_task() for managed_node2/include_tasks 27885 1726882548.49619: done queuing things up, now waiting for results queue to drain 27885 1726882548.49634: waiting for pending results... 27885 1726882548.49884: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 27885 1726882548.50037: in run() - task 12673a56-9f93-3fa5-01be-00000000006c 27885 1726882548.50059: variable 'ansible_search_path' from source: unknown 27885 1726882548.50068: variable 'ansible_search_path' from source: unknown 27885 1726882548.50144: calling self._execute() 27885 1726882548.50229: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882548.50233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882548.50252: variable 'omit' from source: magic vars 27885 1726882548.50530: variable 'ansible_distribution_major_version' from source: facts 27885 1726882548.50540: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882548.50546: _execute() done 27885 1726882548.50549: dumping result to json 27885 1726882548.50553: done dumping result, returning 27885 1726882548.50560: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-3fa5-01be-00000000006c] 27885 1726882548.50565: sending task result for task 12673a56-9f93-3fa5-01be-00000000006c 27885 1726882548.50655: done sending task result for task 12673a56-9f93-3fa5-01be-00000000006c 27885 1726882548.50658: WORKER PROCESS EXITING 27885 1726882548.50709: no more pending results, returning what we have 27885 1726882548.50716: in VariableManager get_vars() 27885 1726882548.50764: Calling all_inventory to load vars for managed_node2 27885 1726882548.50768: Calling groups_inventory to load vars for managed_node2 27885 1726882548.50771: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882548.50779: Calling all_plugins_play to load vars for managed_node2 27885 1726882548.50781: Calling groups_plugins_inventory to load vars for managed_node2 27885 
1726882548.50784: Calling groups_plugins_play to load vars for managed_node2 27885 1726882548.51694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882548.52682: done with get_vars() 27885 1726882548.52707: variable 'ansible_search_path' from source: unknown 27885 1726882548.52709: variable 'ansible_search_path' from source: unknown 27885 1726882548.52735: we have included files to process 27885 1726882548.52736: generating all_blocks data 27885 1726882548.52739: done generating all_blocks data 27885 1726882548.52743: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27885 1726882548.52744: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27885 1726882548.52745: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27885 1726882548.53237: done processing included file 27885 1726882548.53238: iterating over new_blocks loaded from include file 27885 1726882548.53239: in VariableManager get_vars() 27885 1726882548.53254: done with get_vars() 27885 1726882548.53255: filtering new block on tags 27885 1726882548.53268: done filtering new block on tags 27885 1726882548.53269: in VariableManager get_vars() 27885 1726882548.53282: done with get_vars() 27885 1726882548.53283: filtering new block on tags 27885 1726882548.53297: done filtering new block on tags 27885 1726882548.53299: in VariableManager get_vars() 27885 1726882548.53313: done with get_vars() 27885 1726882548.53314: filtering new block on tags 27885 1726882548.53327: done filtering new block on tags 27885 1726882548.53328: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 27885 1726882548.53332: extending task lists for all hosts with included blocks 27885 1726882548.53825: done extending task lists 27885 1726882548.53826: done processing included files 27885 1726882548.53826: results queue empty 27885 1726882548.53827: checking for any_errors_fatal 27885 1726882548.53830: done checking for any_errors_fatal 27885 1726882548.53831: checking for max_fail_percentage 27885 1726882548.53831: done checking for max_fail_percentage 27885 1726882548.53832: checking to see if all hosts have failed and the running result is not ok 27885 1726882548.53832: done checking to see if all hosts have failed 27885 1726882548.53833: getting the remaining hosts for this loop 27885 1726882548.53833: done getting the remaining hosts for this loop 27885 1726882548.53835: getting the next task for host managed_node2 27885 1726882548.53837: done getting next task for host managed_node2 27885 1726882548.53839: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 27885 1726882548.53841: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882548.53847: getting variables 27885 1726882548.53848: in VariableManager get_vars() 27885 1726882548.53858: Calling all_inventory to load vars for managed_node2 27885 1726882548.53860: Calling groups_inventory to load vars for managed_node2 27885 1726882548.53864: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882548.53871: Calling all_plugins_play to load vars for managed_node2 27885 1726882548.53877: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882548.53880: Calling groups_plugins_play to load vars for managed_node2 27885 1726882548.54802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882548.55998: done with get_vars() 27885 1726882548.56020: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:35:48 -0400 (0:00:00.067) 0:00:21.203 ****** 27885 1726882548.56113: entering _queue_task() for managed_node2/setup 27885 1726882548.56431: worker is 1 (out of 1 available) 27885 1726882548.56444: exiting _queue_task() for managed_node2/setup 27885 1726882548.56457: done queuing things up, now waiting for results queue to drain 27885 1726882548.56458: waiting for pending results... 
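The include above pulls in roles/network/tasks/set_facts.yml, whose first task ("Ensure ansible_facts used by role are present", set_facts.yml:3) runs the setup module only when facts the role needs are missing; a few lines further down, the log evaluates the guard __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 to False and skips the task. A sketch of that conditional fact-gathering pattern follows; the when expression is copied from the log, __network_required_facts comes from the role defaults per the log, and the (empty) setup arguments are an assumption since they are not visible in this excerpt.

- name: Ensure ansible_facts used by role are present
  setup:                     # module arguments not shown in this log excerpt
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
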
27885 1726882548.56649: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 27885 1726882548.56766: in run() - task 12673a56-9f93-3fa5-01be-000000000563 27885 1726882548.56778: variable 'ansible_search_path' from source: unknown 27885 1726882548.56781: variable 'ansible_search_path' from source: unknown 27885 1726882548.56827: calling self._execute() 27885 1726882548.56898: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882548.56905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882548.56913: variable 'omit' from source: magic vars 27885 1726882548.57217: variable 'ansible_distribution_major_version' from source: facts 27885 1726882548.57227: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882548.57373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882548.59750: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882548.59870: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882548.59927: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882548.59941: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882548.59984: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882548.60085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882548.60116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882548.60150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882548.60166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882548.60177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882548.60260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882548.60306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882548.60325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882548.60381: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882548.60389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882548.60613: variable '__network_required_facts' from source: role '' defaults 27885 1726882548.60617: variable 'ansible_facts' from source: unknown 27885 1726882548.61746: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 27885 1726882548.61750: when evaluation is False, skipping this task 27885 1726882548.61752: _execute() done 27885 1726882548.61754: dumping result to json 27885 1726882548.61755: done dumping result, returning 27885 1726882548.61758: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-3fa5-01be-000000000563] 27885 1726882548.61759: sending task result for task 12673a56-9f93-3fa5-01be-000000000563 27885 1726882548.61999: done sending task result for task 12673a56-9f93-3fa5-01be-000000000563 27885 1726882548.62003: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27885 1726882548.62046: no more pending results, returning what we have 27885 1726882548.62049: results queue empty 27885 1726882548.62050: checking for any_errors_fatal 27885 1726882548.62051: done checking for any_errors_fatal 27885 1726882548.62052: checking for max_fail_percentage 27885 1726882548.62053: done checking for max_fail_percentage 27885 1726882548.62054: checking to see if all hosts have failed and the running result is not ok 27885 1726882548.62055: done checking to see if all hosts have failed 27885 1726882548.62056: getting the remaining hosts for this loop 27885 1726882548.62057: done getting the remaining hosts for this loop 27885 1726882548.62060: getting the next task for host managed_node2 27885 1726882548.62068: done getting next task for host managed_node2 27885 1726882548.62072: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 27885 1726882548.62077: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882548.62095: getting variables 27885 1726882548.62096: in VariableManager get_vars() 27885 1726882548.62134: Calling all_inventory to load vars for managed_node2 27885 1726882548.62137: Calling groups_inventory to load vars for managed_node2 27885 1726882548.62139: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882548.62147: Calling all_plugins_play to load vars for managed_node2 27885 1726882548.62150: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882548.62152: Calling groups_plugins_play to load vars for managed_node2 27885 1726882548.63989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882548.66408: done with get_vars() 27885 1726882548.66444: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:35:48 -0400 (0:00:00.104) 0:00:21.308 ****** 27885 1726882548.66559: entering _queue_task() for managed_node2/stat 27885 1726882548.66930: worker is 1 (out of 1 available) 27885 1726882548.66944: exiting _queue_task() for managed_node2/stat 27885 1726882548.66957: done queuing things up, now waiting for results queue to drain 27885 1726882548.66958: waiting for pending results... 27885 1726882548.67376: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 27885 1726882548.67435: in run() - task 12673a56-9f93-3fa5-01be-000000000565 27885 1726882548.67450: variable 'ansible_search_path' from source: unknown 27885 1726882548.67454: variable 'ansible_search_path' from source: unknown 27885 1726882548.67619: calling self._execute() 27885 1726882548.67622: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882548.67625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882548.67628: variable 'omit' from source: magic vars 27885 1726882548.68020: variable 'ansible_distribution_major_version' from source: facts 27885 1726882548.68032: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882548.68209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882548.68501: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882548.68547: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882548.68579: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882548.68619: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882548.68711: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882548.68732: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882548.68759: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, 
class_only=False) 27885 1726882548.68786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882548.68892: variable '__network_is_ostree' from source: set_fact 27885 1726882548.68897: Evaluated conditional (not __network_is_ostree is defined): False 27885 1726882548.68900: when evaluation is False, skipping this task 27885 1726882548.68902: _execute() done 27885 1726882548.68905: dumping result to json 27885 1726882548.68907: done dumping result, returning 27885 1726882548.68910: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-3fa5-01be-000000000565] 27885 1726882548.68912: sending task result for task 12673a56-9f93-3fa5-01be-000000000565 27885 1726882548.69064: done sending task result for task 12673a56-9f93-3fa5-01be-000000000565 27885 1726882548.69067: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 27885 1726882548.69125: no more pending results, returning what we have 27885 1726882548.69130: results queue empty 27885 1726882548.69131: checking for any_errors_fatal 27885 1726882548.69140: done checking for any_errors_fatal 27885 1726882548.69141: checking for max_fail_percentage 27885 1726882548.69142: done checking for max_fail_percentage 27885 1726882548.69143: checking to see if all hosts have failed and the running result is not ok 27885 1726882548.69144: done checking to see if all hosts have failed 27885 1726882548.69145: getting the remaining hosts for this loop 27885 1726882548.69147: done getting the remaining hosts for this loop 27885 1726882548.69150: getting the next task for host managed_node2 27885 1726882548.69158: done getting next task for host managed_node2 27885 1726882548.69162: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 27885 1726882548.69168: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882548.69187: getting variables 27885 1726882548.69189: in VariableManager get_vars() 27885 1726882548.69236: Calling all_inventory to load vars for managed_node2 27885 1726882548.69239: Calling groups_inventory to load vars for managed_node2 27885 1726882548.69242: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882548.69253: Calling all_plugins_play to load vars for managed_node2 27885 1726882548.69256: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882548.69260: Calling groups_plugins_play to load vars for managed_node2 27885 1726882548.70941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882548.72483: done with get_vars() 27885 1726882548.72513: done getting variables 27885 1726882548.72573: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:35:48 -0400 (0:00:00.060) 0:00:21.368 ****** 27885 1726882548.72616: entering _queue_task() for managed_node2/set_fact 27885 1726882548.72950: worker is 1 (out of 1 available) 27885 1726882548.72962: exiting _queue_task() for managed_node2/set_fact 27885 1726882548.72973: done queuing things up, now waiting for results queue to drain 27885 1726882548.72975: waiting for pending results... 27885 1726882548.73413: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 27885 1726882548.73425: in run() - task 12673a56-9f93-3fa5-01be-000000000566 27885 1726882548.73429: variable 'ansible_search_path' from source: unknown 27885 1726882548.73432: variable 'ansible_search_path' from source: unknown 27885 1726882548.73499: calling self._execute() 27885 1726882548.73570: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882548.73580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882548.73597: variable 'omit' from source: magic vars 27885 1726882548.73980: variable 'ansible_distribution_major_version' from source: facts 27885 1726882548.74001: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882548.74165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882548.74435: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882548.74483: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882548.74532: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882548.74571: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882548.74801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882548.74804: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882548.74807: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882548.74809: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882548.74851: variable '__network_is_ostree' from source: set_fact 27885 1726882548.74863: Evaluated conditional (not __network_is_ostree is defined): False 27885 1726882548.74871: when evaluation is False, skipping this task 27885 1726882548.74879: _execute() done 27885 1726882548.74886: dumping result to json 27885 1726882548.74900: done dumping result, returning 27885 1726882548.74916: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-3fa5-01be-000000000566] 27885 1726882548.74927: sending task result for task 12673a56-9f93-3fa5-01be-000000000566 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 27885 1726882548.75073: no more pending results, returning what we have 27885 1726882548.75077: results queue empty 27885 1726882548.75078: checking for any_errors_fatal 27885 1726882548.75086: done checking for any_errors_fatal 27885 1726882548.75087: checking for max_fail_percentage 27885 1726882548.75092: done checking for max_fail_percentage 27885 1726882548.75095: checking to see if all hosts have failed and the running result is not ok 27885 1726882548.75096: done checking to see if all hosts have failed 27885 1726882548.75097: getting the remaining hosts for this loop 27885 1726882548.75099: done getting the remaining hosts for this loop 27885 1726882548.75103: getting the next task for host managed_node2 27885 1726882548.75114: done getting next task for host managed_node2 27885 1726882548.75117: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 27885 1726882548.75122: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882548.75141: getting variables 27885 1726882548.75143: in VariableManager get_vars() 27885 1726882548.75186: Calling all_inventory to load vars for managed_node2 27885 1726882548.75296: Calling groups_inventory to load vars for managed_node2 27885 1726882548.75301: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882548.75308: done sending task result for task 12673a56-9f93-3fa5-01be-000000000566 27885 1726882548.75311: WORKER PROCESS EXITING 27885 1726882548.75321: Calling all_plugins_play to load vars for managed_node2 27885 1726882548.75324: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882548.75328: Calling groups_plugins_play to load vars for managed_node2 27885 1726882548.76860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882548.78427: done with get_vars() 27885 1726882548.78448: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:35:48 -0400 (0:00:00.059) 0:00:21.427 ****** 27885 1726882548.78547: entering _queue_task() for managed_node2/service_facts 27885 1726882548.78838: worker is 1 (out of 1 available) 27885 1726882548.78849: exiting _queue_task() for managed_node2/service_facts 27885 1726882548.78862: done queuing things up, now waiting for results queue to drain 27885 1726882548.78863: waiting for pending results... 27885 1726882548.79151: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 27885 1726882548.79312: in run() - task 12673a56-9f93-3fa5-01be-000000000568 27885 1726882548.79333: variable 'ansible_search_path' from source: unknown 27885 1726882548.79341: variable 'ansible_search_path' from source: unknown 27885 1726882548.79381: calling self._execute() 27885 1726882548.79482: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882548.79502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882548.79518: variable 'omit' from source: magic vars 27885 1726882548.79897: variable 'ansible_distribution_major_version' from source: facts 27885 1726882548.79915: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882548.79927: variable 'omit' from source: magic vars 27885 1726882548.80010: variable 'omit' from source: magic vars 27885 1726882548.80052: variable 'omit' from source: magic vars 27885 1726882548.80102: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882548.80144: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882548.80169: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882548.80201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882548.80219: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882548.80299: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882548.80303: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882548.80305: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882548.80375: Set connection var ansible_pipelining to False 27885 1726882548.80385: Set connection var ansible_connection to ssh 27885 1726882548.80404: Set connection var ansible_timeout to 10 27885 1726882548.80414: Set connection var ansible_shell_type to sh 27885 1726882548.80427: Set connection var ansible_shell_executable to /bin/sh 27885 1726882548.80437: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882548.80462: variable 'ansible_shell_executable' from source: unknown 27885 1726882548.80470: variable 'ansible_connection' from source: unknown 27885 1726882548.80519: variable 'ansible_module_compression' from source: unknown 27885 1726882548.80522: variable 'ansible_shell_type' from source: unknown 27885 1726882548.80524: variable 'ansible_shell_executable' from source: unknown 27885 1726882548.80527: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882548.80528: variable 'ansible_pipelining' from source: unknown 27885 1726882548.80530: variable 'ansible_timeout' from source: unknown 27885 1726882548.80532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882548.80710: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882548.80727: variable 'omit' from source: magic vars 27885 1726882548.80742: starting attempt loop 27885 1726882548.80749: running the handler 27885 1726882548.80845: _low_level_execute_command(): starting 27885 1726882548.80848: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882548.81519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882548.81618: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882548.81636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882548.81656: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882548.81680: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882548.81804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882548.83446: stdout chunk (state=3): >>>/root <<< 27885 1726882548.83550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882548.83598: stderr chunk (state=3): >>><<< 27885 1726882548.83601: stdout chunk 
(state=3): >>><<< 27885 1726882548.83612: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882548.83798: _low_level_execute_command(): starting 27885 1726882548.83804: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882548.8361104-28897-222729235907966 `" && echo ansible-tmp-1726882548.8361104-28897-222729235907966="` echo /root/.ansible/tmp/ansible-tmp-1726882548.8361104-28897-222729235907966 `" ) && sleep 0' 27885 1726882548.84238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882548.84469: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882548.84471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882548.84474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882548.84476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882548.84478: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882548.84486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882548.84488: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27885 1726882548.84494: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882548.84497: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27885 1726882548.84500: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882548.84502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882548.84515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882548.84517: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882548.84519: stderr chunk (state=3): >>>debug2: match found <<< 27885 1726882548.84521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882548.84522: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882548.84524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882548.84526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882548.84739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882548.86622: stdout chunk (state=3): >>>ansible-tmp-1726882548.8361104-28897-222729235907966=/root/.ansible/tmp/ansible-tmp-1726882548.8361104-28897-222729235907966 <<< 27885 1726882548.86765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882548.86769: stdout chunk (state=3): >>><<< 27885 1726882548.86771: stderr chunk (state=3): >>><<< 27885 1726882548.86999: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882548.8361104-28897-222729235907966=/root/.ansible/tmp/ansible-tmp-1726882548.8361104-28897-222729235907966 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882548.87002: variable 'ansible_module_compression' from source: unknown 27885 1726882548.87005: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 27885 1726882548.87007: variable 'ansible_facts' from source: unknown 27885 1726882548.87041: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882548.8361104-28897-222729235907966/AnsiballZ_service_facts.py 27885 1726882548.87261: Sending initial data 27885 1726882548.87264: Sent initial data (162 bytes) 27885 1726882548.87852: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882548.87855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882548.87914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882548.87932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882548.87944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882548.87953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882548.88050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882548.89583: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882548.89625: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882548.89726: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp41xqiq43 /root/.ansible/tmp/ansible-tmp-1726882548.8361104-28897-222729235907966/AnsiballZ_service_facts.py <<< 27885 1726882548.89729: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882548.8361104-28897-222729235907966/AnsiballZ_service_facts.py" <<< 27885 1726882548.89840: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp41xqiq43" to remote "/root/.ansible/tmp/ansible-tmp-1726882548.8361104-28897-222729235907966/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882548.8361104-28897-222729235907966/AnsiballZ_service_facts.py" <<< 27885 1726882548.90757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882548.90779: stderr chunk (state=3): >>><<< 27885 1726882548.90782: stdout chunk (state=3): >>><<< 27885 1726882548.90873: done transferring module to remote 27885 1726882548.90972: _low_level_execute_command(): starting 27885 1726882548.90975: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882548.8361104-28897-222729235907966/ /root/.ansible/tmp/ansible-tmp-1726882548.8361104-28897-222729235907966/AnsiballZ_service_facts.py && sleep 0' 27885 1726882548.91475: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882548.91478: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882548.91488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882548.91583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882548.91631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882548.91696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882548.93557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882548.93561: stdout chunk (state=3): >>><<< 27885 1726882548.93563: stderr chunk (state=3): >>><<< 27885 1726882548.93565: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882548.93567: _low_level_execute_command(): starting 27885 1726882548.93570: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882548.8361104-28897-222729235907966/AnsiballZ_service_facts.py && sleep 0' 27885 1726882548.94133: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882548.94137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882548.94139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882548.94142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882548.94144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882548.94146: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882548.94148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882548.94150: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
<<< 27885 1726882548.94152: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882548.94154: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27885 1726882548.94156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882548.94158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882548.94160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882548.94162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882548.94164: stderr chunk (state=3): >>>debug2: match found <<< 27885 1726882548.94169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882548.94238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882548.94246: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882548.94267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882548.94356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882550.44440: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": 
"sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": 
{"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": 
"systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": 
"sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": 
"systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 27885 1726882550.45869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882550.45873: stdout chunk (state=3): >>><<< 27885 1726882550.45876: stderr chunk (state=3): >>><<< 27885 1726882550.46107: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": { ... full service_facts output, identical to the stdout chunks above ... }}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed.
27885 1726882550.47030: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882548.8361104-28897-222729235907966/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882550.47048: _low_level_execute_command(): starting 27885 1726882550.47060: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882548.8361104-28897-222729235907966/ > /dev/null 2>&1 && sleep 0' 27885 1726882550.48395: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882550.48540: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882550.48559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882550.48618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882550.48734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882550.48940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882550.49069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882550.50986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882550.50990: stdout chunk (state=3): >>><<< 27885 1726882550.50992: stderr chunk (state=3): >>><<< 27885 1726882550.51222: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882550.51226: handler run complete 27885 1726882550.51631: variable 'ansible_facts' from source: unknown 27885 1726882550.51933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882550.53300: variable 'ansible_facts' from source: unknown 27885 1726882550.53473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882550.53921: attempt loop complete, returning result 27885 1726882550.53960: _execute() done 27885 1726882550.53970: dumping result to json 27885 1726882550.54043: done dumping result, returning 27885 1726882550.54063: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-3fa5-01be-000000000568] 27885 1726882550.54073: sending task result for task 12673a56-9f93-3fa5-01be-000000000568 27885 1726882550.56171: done sending task result for task 12673a56-9f93-3fa5-01be-000000000568 27885 1726882550.56175: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27885 1726882550.56287: no more pending results, returning what we have 27885 1726882550.56294: results queue empty 27885 1726882550.56295: checking for any_errors_fatal 27885 1726882550.56303: done checking for any_errors_fatal 27885 1726882550.56304: checking for max_fail_percentage 27885 1726882550.56305: done checking for max_fail_percentage 27885 1726882550.56306: checking to see if all hosts have failed and the running result is not ok 27885 1726882550.56307: done checking to see if all hosts have failed 27885 1726882550.56307: getting the remaining hosts for this loop 27885 1726882550.56309: done getting the remaining hosts for this loop 27885 1726882550.56312: getting the next task for host managed_node2 27885 1726882550.56317: done getting next task for host managed_node2 27885 1726882550.56320: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 27885 1726882550.56328: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882550.56340: getting variables 27885 1726882550.56341: in VariableManager get_vars() 27885 1726882550.56372: Calling all_inventory to load vars for managed_node2 27885 1726882550.56375: Calling groups_inventory to load vars for managed_node2 27885 1726882550.56378: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882550.56386: Calling all_plugins_play to load vars for managed_node2 27885 1726882550.56392: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882550.56430: Calling groups_plugins_play to load vars for managed_node2 27885 1726882550.58572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882550.60715: done with get_vars() 27885 1726882550.60761: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:35:50 -0400 (0:00:01.823) 0:00:23.251 ****** 27885 1726882550.60929: entering _queue_task() for managed_node2/package_facts 27885 1726882550.61465: worker is 1 (out of 1 available) 27885 1726882550.61477: exiting _queue_task() for managed_node2/package_facts 27885 1726882550.61490: done queuing things up, now waiting for results queue to drain 27885 1726882550.61491: waiting for pending results... 27885 1726882550.61999: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 27885 1726882550.62202: in run() - task 12673a56-9f93-3fa5-01be-000000000569 27885 1726882550.62244: variable 'ansible_search_path' from source: unknown 27885 1726882550.62310: variable 'ansible_search_path' from source: unknown 27885 1726882550.62314: calling self._execute() 27885 1726882550.62644: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882550.62754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882550.62758: variable 'omit' from source: magic vars 27885 1726882550.63397: variable 'ansible_distribution_major_version' from source: facts 27885 1726882550.63416: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882550.63431: variable 'omit' from source: magic vars 27885 1726882550.63518: variable 'omit' from source: magic vars 27885 1726882550.63586: variable 'omit' from source: magic vars 27885 1726882550.63732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882550.63801: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882550.63947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882550.63951: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882550.63953: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882550.64082: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882550.64092: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882550.64102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882550.64216: Set connection var ansible_pipelining to False 27885 
1726882550.64264: Set connection var ansible_connection to ssh 27885 1726882550.64267: Set connection var ansible_timeout to 10 27885 1726882550.64270: Set connection var ansible_shell_type to sh 27885 1726882550.64272: Set connection var ansible_shell_executable to /bin/sh 27885 1726882550.64300: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882550.64373: variable 'ansible_shell_executable' from source: unknown 27885 1726882550.64376: variable 'ansible_connection' from source: unknown 27885 1726882550.64380: variable 'ansible_module_compression' from source: unknown 27885 1726882550.64382: variable 'ansible_shell_type' from source: unknown 27885 1726882550.64384: variable 'ansible_shell_executable' from source: unknown 27885 1726882550.64386: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882550.64388: variable 'ansible_pipelining' from source: unknown 27885 1726882550.64390: variable 'ansible_timeout' from source: unknown 27885 1726882550.64392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882550.64857: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882550.65012: variable 'omit' from source: magic vars 27885 1726882550.65031: starting attempt loop 27885 1726882550.65037: running the handler 27885 1726882550.65054: _low_level_execute_command(): starting 27885 1726882550.65064: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882550.66382: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882550.66429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882550.66514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882550.66539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882550.66556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882550.66579: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882550.66669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882550.68289: stdout chunk (state=3): >>>/root <<< 27885 1726882550.68459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882550.68462: stdout chunk (state=3): >>><<< 27885 1726882550.68466: stderr chunk (state=3): >>><<< 27885 1726882550.68661: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882550.68665: _low_level_execute_command(): starting 27885 1726882550.68667: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882550.6853473-28960-5735984079060 `" && echo ansible-tmp-1726882550.6853473-28960-5735984079060="` echo /root/.ansible/tmp/ansible-tmp-1726882550.6853473-28960-5735984079060 `" ) && sleep 0' 27885 1726882550.69671: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882550.69674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882550.69729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882550.69733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882550.69736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882550.69744: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882550.69799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882550.69837: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882550.69921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882550.71738: stdout chunk (state=3): >>>ansible-tmp-1726882550.6853473-28960-5735984079060=/root/.ansible/tmp/ansible-tmp-1726882550.6853473-28960-5735984079060 <<< 27885 1726882550.71904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882550.71908: stdout chunk (state=3): 
>>><<< 27885 1726882550.71911: stderr chunk (state=3): >>><<< 27885 1726882550.71928: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882550.6853473-28960-5735984079060=/root/.ansible/tmp/ansible-tmp-1726882550.6853473-28960-5735984079060 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882550.72000: variable 'ansible_module_compression' from source: unknown 27885 1726882550.72115: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 27885 1726882550.72199: variable 'ansible_facts' from source: unknown 27885 1726882550.72404: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882550.6853473-28960-5735984079060/AnsiballZ_package_facts.py 27885 1726882550.72753: Sending initial data 27885 1726882550.72769: Sent initial data (160 bytes) 27885 1726882550.73622: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882550.73938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882550.74009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882550.74092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882550.75762: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 
debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882550.75847: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882550.75920: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpp7dt7bp3 /root/.ansible/tmp/ansible-tmp-1726882550.6853473-28960-5735984079060/AnsiballZ_package_facts.py <<< 27885 1726882550.75924: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882550.6853473-28960-5735984079060/AnsiballZ_package_facts.py" <<< 27885 1726882550.75990: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpp7dt7bp3" to remote "/root/.ansible/tmp/ansible-tmp-1726882550.6853473-28960-5735984079060/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882550.6853473-28960-5735984079060/AnsiballZ_package_facts.py" <<< 27885 1726882550.78902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882550.78906: stdout chunk (state=3): >>><<< 27885 1726882550.78913: stderr chunk (state=3): >>><<< 27885 1726882550.78972: done transferring module to remote 27885 1726882550.78986: _low_level_execute_command(): starting 27885 1726882550.78989: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882550.6853473-28960-5735984079060/ /root/.ansible/tmp/ansible-tmp-1726882550.6853473-28960-5735984079060/AnsiballZ_package_facts.py && sleep 0' 27885 1726882550.79797: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882550.79811: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882550.79909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882550.79937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882550.79949: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882550.79967: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882550.80053: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882550.81898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882550.81902: stderr chunk (state=3): >>><<< 27885 1726882550.81904: stdout chunk (state=3): >>><<< 27885 1726882550.81906: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882550.81909: _low_level_execute_command(): starting 27885 1726882550.81912: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882550.6853473-28960-5735984079060/AnsiballZ_package_facts.py && sleep 0' 27885 1726882550.82422: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882550.82431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882550.82448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882550.82461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882550.82473: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882550.82480: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882550.82492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882550.82506: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27885 1726882550.82515: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882550.82522: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27885 1726882550.82529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882550.82539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882550.82558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882550.82611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882550.82692: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882550.82698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882550.82701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882550.82770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882551.26244: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", 
"release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 27885 1726882551.26349: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": 
"iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": 
"6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": 
"5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", 
"source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", 
"release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": 
[{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 27885 1726882551.28330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882551.28335: stdout chunk (state=3): >>><<< 27885 1726882551.28338: stderr chunk (state=3): >>><<< 27885 1726882551.28347: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 27885 1726882551.32900: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882550.6853473-28960-5735984079060/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882551.33314: _low_level_execute_command(): starting 27885 1726882551.33317: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882550.6853473-28960-5735984079060/ > /dev/null 2>&1 && sleep 0' 27885 1726882551.34567: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882551.34571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882551.34671: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882551.34674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882551.34708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882551.34804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882551.34807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882551.34899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882551.36878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882551.36881: stdout chunk (state=3): >>><<< 27885 1726882551.36888: stderr chunk (state=3): >>><<< 27885 1726882551.36906: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882551.36912: handler run complete 27885 1726882551.38621: variable 'ansible_facts' from source: unknown 27885 1726882551.39617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882551.42873: variable 'ansible_facts' from source: unknown 27885 1726882551.43242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882551.44178: attempt loop complete, returning result 27885 1726882551.44192: _execute() done 27885 1726882551.44197: dumping result to json 27885 1726882551.44612: done dumping result, returning 27885 1726882551.44615: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-3fa5-01be-000000000569] 27885 1726882551.44617: sending task result for task 12673a56-9f93-3fa5-01be-000000000569 27885 1726882551.46997: done sending task result for task 12673a56-9f93-3fa5-01be-000000000569 27885 1726882551.47002: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27885 1726882551.47155: no more pending results, returning what we have 27885 1726882551.47159: results queue empty 27885 1726882551.47160: checking for any_errors_fatal 27885 1726882551.47166: done checking for any_errors_fatal 27885 1726882551.47167: checking for max_fail_percentage 27885 1726882551.47168: done checking for max_fail_percentage 27885 1726882551.47169: checking to see if all hosts have failed and the running result is not ok 27885 1726882551.47170: done checking to see if all hosts have failed 27885 1726882551.47171: getting the remaining hosts for this loop 27885 1726882551.47172: done getting the remaining hosts for this loop 27885 1726882551.47176: getting the next task for host managed_node2 27885 1726882551.47182: done getting next task for host managed_node2 27885 1726882551.47185: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 27885 1726882551.47191: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882551.47205: getting variables 27885 1726882551.47206: in VariableManager get_vars() 27885 1726882551.47240: Calling all_inventory to load vars for managed_node2 27885 1726882551.47243: Calling groups_inventory to load vars for managed_node2 27885 1726882551.47246: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882551.47255: Calling all_plugins_play to load vars for managed_node2 27885 1726882551.47258: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882551.47261: Calling groups_plugins_play to load vars for managed_node2 27885 1726882551.48706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882551.50564: done with get_vars() 27885 1726882551.50599: done getting variables 27885 1726882551.50665: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:35:51 -0400 (0:00:00.897) 0:00:24.149 ****** 27885 1726882551.50712: entering _queue_task() for managed_node2/debug 27885 1726882551.51199: worker is 1 (out of 1 available) 27885 1726882551.51244: exiting _queue_task() for managed_node2/debug 27885 1726882551.51256: done queuing things up, now waiting for results queue to drain 27885 1726882551.51257: waiting for pending results... 27885 1726882551.51550: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 27885 1726882551.51671: in run() - task 12673a56-9f93-3fa5-01be-00000000006d 27885 1726882551.51701: variable 'ansible_search_path' from source: unknown 27885 1726882551.51711: variable 'ansible_search_path' from source: unknown 27885 1726882551.51799: calling self._execute() 27885 1726882551.51871: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882551.51883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882551.51906: variable 'omit' from source: magic vars 27885 1726882551.52379: variable 'ansible_distribution_major_version' from source: facts 27885 1726882551.52435: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882551.52452: variable 'omit' from source: magic vars 27885 1726882551.52592: variable 'omit' from source: magic vars 27885 1726882551.52704: variable 'network_provider' from source: set_fact 27885 1726882551.52756: variable 'omit' from source: magic vars 27885 1726882551.53083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882551.53169: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882551.53173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882551.53176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882551.53178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 
1726882551.53286: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882551.53302: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882551.53496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882551.53499: Set connection var ansible_pipelining to False 27885 1726882551.53610: Set connection var ansible_connection to ssh 27885 1726882551.53708: Set connection var ansible_timeout to 10 27885 1726882551.53711: Set connection var ansible_shell_type to sh 27885 1726882551.53713: Set connection var ansible_shell_executable to /bin/sh 27885 1726882551.53716: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882551.53718: variable 'ansible_shell_executable' from source: unknown 27885 1726882551.53721: variable 'ansible_connection' from source: unknown 27885 1726882551.53724: variable 'ansible_module_compression' from source: unknown 27885 1726882551.53726: variable 'ansible_shell_type' from source: unknown 27885 1726882551.53728: variable 'ansible_shell_executable' from source: unknown 27885 1726882551.53730: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882551.53732: variable 'ansible_pipelining' from source: unknown 27885 1726882551.53734: variable 'ansible_timeout' from source: unknown 27885 1726882551.53736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882551.54070: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882551.54073: variable 'omit' from source: magic vars 27885 1726882551.54216: starting attempt loop 27885 1726882551.54220: running the handler 27885 1726882551.54222: handler run complete 27885 1726882551.54252: attempt loop complete, returning result 27885 1726882551.54325: _execute() done 27885 1726882551.54328: dumping result to json 27885 1726882551.54330: done dumping result, returning 27885 1726882551.54333: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-3fa5-01be-00000000006d] 27885 1726882551.54335: sending task result for task 12673a56-9f93-3fa5-01be-00000000006d ok: [managed_node2] => {} MSG: Using network provider: nm 27885 1726882551.54721: no more pending results, returning what we have 27885 1726882551.54726: results queue empty 27885 1726882551.54727: checking for any_errors_fatal 27885 1726882551.54738: done checking for any_errors_fatal 27885 1726882551.54739: checking for max_fail_percentage 27885 1726882551.54740: done checking for max_fail_percentage 27885 1726882551.54741: checking to see if all hosts have failed and the running result is not ok 27885 1726882551.54742: done checking to see if all hosts have failed 27885 1726882551.54743: getting the remaining hosts for this loop 27885 1726882551.54744: done getting the remaining hosts for this loop 27885 1726882551.54748: getting the next task for host managed_node2 27885 1726882551.54755: done getting next task for host managed_node2 27885 1726882551.54759: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 27885 1726882551.54763: ^ state is: HOST STATE: block=3, 
task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882551.54776: getting variables 27885 1726882551.54777: in VariableManager get_vars() 27885 1726882551.55029: Calling all_inventory to load vars for managed_node2 27885 1726882551.55032: Calling groups_inventory to load vars for managed_node2 27885 1726882551.55034: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882551.55094: Calling all_plugins_play to load vars for managed_node2 27885 1726882551.55099: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882551.55103: Calling groups_plugins_play to load vars for managed_node2 27885 1726882551.55129: done sending task result for task 12673a56-9f93-3fa5-01be-00000000006d 27885 1726882551.55132: WORKER PROCESS EXITING 27885 1726882551.56516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882551.58006: done with get_vars() 27885 1726882551.58023: done getting variables 27885 1726882551.58071: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:35:51 -0400 (0:00:00.073) 0:00:24.223 ****** 27885 1726882551.58100: entering _queue_task() for managed_node2/fail 27885 1726882551.58337: worker is 1 (out of 1 available) 27885 1726882551.58350: exiting _queue_task() for managed_node2/fail 27885 1726882551.58362: done queuing things up, now waiting for results queue to drain 27885 1726882551.58364: waiting for pending results... 
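[Annotation, not captured log output] The package_facts result earlier in this trace is censored by no_log: true, but its shape is visible in the module return: ansible_facts.packages maps each package name to a list of {name, version, release, epoch, arch, source} entries. A minimal sketch of gathering and querying those facts follows; the follow-up debug task and its wording are hypothetical and not part of the role.

- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto          # matches the module_args shown in the trace
  no_log: true             # why the result above prints as "censored"

- name: Report the installed NetworkManager version (hypothetical example task)
  ansible.builtin.debug:
    msg: "NetworkManager {{ ansible_facts.packages['NetworkManager'][0].version }} is installed"
  when: "'NetworkManager' in ansible_facts.packages"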
27885 1726882551.58551: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 27885 1726882551.58681: in run() - task 12673a56-9f93-3fa5-01be-00000000006e 27885 1726882551.58705: variable 'ansible_search_path' from source: unknown 27885 1726882551.58716: variable 'ansible_search_path' from source: unknown 27885 1726882551.58758: calling self._execute() 27885 1726882551.58865: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882551.58878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882551.58900: variable 'omit' from source: magic vars 27885 1726882551.59255: variable 'ansible_distribution_major_version' from source: facts 27885 1726882551.59488: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882551.59496: variable 'network_state' from source: role '' defaults 27885 1726882551.59498: Evaluated conditional (network_state != {}): False 27885 1726882551.59501: when evaluation is False, skipping this task 27885 1726882551.59503: _execute() done 27885 1726882551.59544: dumping result to json 27885 1726882551.59555: done dumping result, returning 27885 1726882551.59558: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-3fa5-01be-00000000006e] 27885 1726882551.59561: sending task result for task 12673a56-9f93-3fa5-01be-00000000006e 27885 1726882551.59632: done sending task result for task 12673a56-9f93-3fa5-01be-00000000006e 27885 1726882551.59635: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27885 1726882551.59696: no more pending results, returning what we have 27885 1726882551.59700: results queue empty 27885 1726882551.59701: checking for any_errors_fatal 27885 1726882551.59706: done checking for any_errors_fatal 27885 1726882551.59706: checking for max_fail_percentage 27885 1726882551.59708: done checking for max_fail_percentage 27885 1726882551.59708: checking to see if all hosts have failed and the running result is not ok 27885 1726882551.59709: done checking to see if all hosts have failed 27885 1726882551.59710: getting the remaining hosts for this loop 27885 1726882551.59711: done getting the remaining hosts for this loop 27885 1726882551.59714: getting the next task for host managed_node2 27885 1726882551.59718: done getting next task for host managed_node2 27885 1726882551.59720: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 27885 1726882551.59723: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882551.59734: getting variables 27885 1726882551.59735: in VariableManager get_vars() 27885 1726882551.59768: Calling all_inventory to load vars for managed_node2 27885 1726882551.59771: Calling groups_inventory to load vars for managed_node2 27885 1726882551.59775: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882551.59784: Calling all_plugins_play to load vars for managed_node2 27885 1726882551.59787: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882551.59794: Calling groups_plugins_play to load vars for managed_node2 27885 1726882551.61035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882551.62151: done with get_vars() 27885 1726882551.62168: done getting variables 27885 1726882551.62214: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:35:51 -0400 (0:00:00.041) 0:00:24.264 ****** 27885 1726882551.62237: entering _queue_task() for managed_node2/fail 27885 1726882551.62475: worker is 1 (out of 1 available) 27885 1726882551.62492: exiting _queue_task() for managed_node2/fail 27885 1726882551.62506: done queuing things up, now waiting for results queue to drain 27885 1726882551.62507: waiting for pending results... 
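[Annotation, not captured log output] The abort task above (main.yml:11) is skipped because its first clause, network_state != {}, evaluates to False against the role's empty network_state default. A rough reconstruction of that guard is sketched below; the initscripts clause and the fail message are assumptions inferred from the task name, not taken from the role source.

- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: The initscripts provider cannot apply a network_state configuration.  # assumed wording
  when:
    - network_state != {}                 # clause reported as false_condition in the skip above
    - network_provider == "initscripts"   # assumed second clause implied by the task name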
27885 1726882551.62679: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 27885 1726882551.62771: in run() - task 12673a56-9f93-3fa5-01be-00000000006f 27885 1726882551.62782: variable 'ansible_search_path' from source: unknown 27885 1726882551.62785: variable 'ansible_search_path' from source: unknown 27885 1726882551.62818: calling self._execute() 27885 1726882551.62936: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882551.62939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882551.62943: variable 'omit' from source: magic vars 27885 1726882551.63498: variable 'ansible_distribution_major_version' from source: facts 27885 1726882551.63501: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882551.63504: variable 'network_state' from source: role '' defaults 27885 1726882551.63507: Evaluated conditional (network_state != {}): False 27885 1726882551.63510: when evaluation is False, skipping this task 27885 1726882551.63512: _execute() done 27885 1726882551.63514: dumping result to json 27885 1726882551.63517: done dumping result, returning 27885 1726882551.63520: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-3fa5-01be-00000000006f] 27885 1726882551.63523: sending task result for task 12673a56-9f93-3fa5-01be-00000000006f 27885 1726882551.63626: done sending task result for task 12673a56-9f93-3fa5-01be-00000000006f 27885 1726882551.63628: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27885 1726882551.63677: no more pending results, returning what we have 27885 1726882551.63681: results queue empty 27885 1726882551.63682: checking for any_errors_fatal 27885 1726882551.63694: done checking for any_errors_fatal 27885 1726882551.63695: checking for max_fail_percentage 27885 1726882551.63696: done checking for max_fail_percentage 27885 1726882551.63697: checking to see if all hosts have failed and the running result is not ok 27885 1726882551.63698: done checking to see if all hosts have failed 27885 1726882551.63698: getting the remaining hosts for this loop 27885 1726882551.63700: done getting the remaining hosts for this loop 27885 1726882551.63704: getting the next task for host managed_node2 27885 1726882551.63710: done getting next task for host managed_node2 27885 1726882551.63714: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 27885 1726882551.63717: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882551.63736: getting variables 27885 1726882551.63738: in VariableManager get_vars() 27885 1726882551.63773: Calling all_inventory to load vars for managed_node2 27885 1726882551.63776: Calling groups_inventory to load vars for managed_node2 27885 1726882551.63778: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882551.63786: Calling all_plugins_play to load vars for managed_node2 27885 1726882551.63788: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882551.63880: Calling groups_plugins_play to load vars for managed_node2 27885 1726882551.65068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882551.66254: done with get_vars() 27885 1726882551.66271: done getting variables 27885 1726882551.66316: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:35:51 -0400 (0:00:00.041) 0:00:24.305 ****** 27885 1726882551.66341: entering _queue_task() for managed_node2/fail 27885 1726882551.66573: worker is 1 (out of 1 available) 27885 1726882551.66585: exiting _queue_task() for managed_node2/fail 27885 1726882551.66599: done queuing things up, now waiting for results queue to drain 27885 1726882551.66601: waiting for pending results... 
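[Annotation, not captured log output] Every task in this trace follows the same evaluation pattern: the distribution gate ansible_distribution_major_version != '6' is checked first, then the task's own clauses, and the first clause that comes back False is echoed as false_condition in the skipped result. A sketch of that pattern for the main.yml:18 task; the version bound and the message are assumptions.

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying network_state requires EL 8 or later on the managed host.  # assumed wording
  when:
    - ansible_distribution_major_version != '6'    # gate evaluated True in the trace
    - network_state != {}                          # evaluated False, so the task is skipped
    - ansible_distribution_major_version | int < 8 # assumed final clause matching the task name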
27885 1726882551.66790: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 27885 1726882551.66900: in run() - task 12673a56-9f93-3fa5-01be-000000000070 27885 1726882551.66913: variable 'ansible_search_path' from source: unknown 27885 1726882551.66917: variable 'ansible_search_path' from source: unknown 27885 1726882551.66946: calling self._execute() 27885 1726882551.67019: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882551.67024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882551.67033: variable 'omit' from source: magic vars 27885 1726882551.67308: variable 'ansible_distribution_major_version' from source: facts 27885 1726882551.67317: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882551.67435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882551.69303: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882551.69349: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882551.69376: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882551.69406: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882551.69426: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882551.69486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882551.69511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882551.69528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882551.69557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882551.69567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882551.69633: variable 'ansible_distribution_major_version' from source: facts 27885 1726882551.69645: Evaluated conditional (ansible_distribution_major_version | int > 9): True 27885 1726882551.69723: variable 'ansible_distribution' from source: facts 27885 1726882551.69727: variable '__network_rh_distros' from source: role '' defaults 27885 1726882551.69735: Evaluated conditional (ansible_distribution in __network_rh_distros): True 27885 1726882551.69891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882551.69911: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882551.69927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882551.69951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882551.69962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882551.70000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882551.70017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882551.70033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882551.70057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882551.70067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882551.70101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882551.70118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882551.70134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882551.70157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882551.70167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882551.70372: variable 'network_connections' from source: task vars 27885 1726882551.70381: variable 'interface1' from source: play vars 27885 1726882551.70440: variable 'interface1' from source: play vars 27885 1726882551.70488: variable 'interface1_mac' from source: set_fact 27885 1726882551.70508: variable 'network_state' from source: role '' defaults 27885 1726882551.70553: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882551.70661: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882551.70687: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882551.70714: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882551.70734: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882551.70767: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882551.70785: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882551.70807: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882551.70825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882551.70855: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 27885 1726882551.70859: when evaluation is False, skipping this task 27885 1726882551.70861: _execute() done 27885 1726882551.70864: dumping result to json 27885 1726882551.70866: done dumping result, returning 27885 1726882551.70869: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-3fa5-01be-000000000070] 27885 1726882551.70874: sending task result for task 12673a56-9f93-3fa5-01be-000000000070 27885 1726882551.70957: done sending task result for task 12673a56-9f93-3fa5-01be-000000000070 27885 1726882551.70960: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 27885 1726882551.71023: no more pending results, returning what we have 27885 1726882551.71026: results queue empty 27885 1726882551.71027: checking for any_errors_fatal 27885 1726882551.71033: done checking for any_errors_fatal 27885 1726882551.71034: checking for max_fail_percentage 27885 1726882551.71036: done checking for max_fail_percentage 27885 1726882551.71036: checking to see if all hosts have failed and the running result is not ok 27885 1726882551.71037: done checking to see if all hosts have failed 27885 1726882551.71038: getting the remaining hosts for this loop 27885 1726882551.71040: done getting the remaining hosts for this loop 27885 1726882551.71043: getting the next task for host 
managed_node2 27885 1726882551.71050: done getting next task for host managed_node2 27885 1726882551.71054: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 27885 1726882551.71057: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882551.71075: getting variables 27885 1726882551.71077: in VariableManager get_vars() 27885 1726882551.71115: Calling all_inventory to load vars for managed_node2 27885 1726882551.71117: Calling groups_inventory to load vars for managed_node2 27885 1726882551.71119: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882551.71128: Calling all_plugins_play to load vars for managed_node2 27885 1726882551.71130: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882551.71132: Calling groups_plugins_play to load vars for managed_node2 27885 1726882551.72028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882551.75956: done with get_vars() 27885 1726882551.75973: done getting variables 27885 1726882551.76012: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:35:51 -0400 (0:00:00.096) 0:00:24.402 ****** 27885 1726882551.76032: entering _queue_task() for managed_node2/dnf 27885 1726882551.76286: worker is 1 (out of 1 available) 27885 1726882551.76301: exiting _queue_task() for managed_node2/dnf 27885 1726882551.76314: done queuing things up, now waiting for results queue to drain 27885 1726882551.76316: waiting for pending results... 
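The trace above covers the EL10 teaming guard. Both distribution gates evaluated True (ansible_distribution_major_version | int > 9 and ansible_distribution in __network_rh_distros), but the final expression found no connection or network_state interface of type "team", so the guard was skipped instead of aborting the play. The snippet below is a minimal reconstruction of how such a guard could be written; the fail module and its message are assumptions for illustration, while the when expressions are copied verbatim from the evaluations logged above. Because a when list is ANDed, the selectattr filtering only matters once the distribution checks have already passed.

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    # Illustrative message only; the role's actual wording is not shown in this log.
    msg: Teaming is not supported on EL10 or later.
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    - network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0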
27885 1726882551.76501: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 27885 1726882551.76588: in run() - task 12673a56-9f93-3fa5-01be-000000000071 27885 1726882551.76603: variable 'ansible_search_path' from source: unknown 27885 1726882551.76607: variable 'ansible_search_path' from source: unknown 27885 1726882551.76635: calling self._execute() 27885 1726882551.76714: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882551.76718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882551.76726: variable 'omit' from source: magic vars 27885 1726882551.77004: variable 'ansible_distribution_major_version' from source: facts 27885 1726882551.77013: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882551.77148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882551.78933: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882551.78975: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882551.79017: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882551.79053: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882551.79147: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882551.79159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882551.79190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882551.79222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882551.79263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882551.79276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882551.79389: variable 'ansible_distribution' from source: facts 27885 1726882551.79396: variable 'ansible_distribution_major_version' from source: facts 27885 1726882551.79499: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 27885 1726882551.79522: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882551.79651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882551.79676: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882551.79706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882551.79798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882551.79802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882551.79804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882551.79825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882551.79848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882551.79899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882551.79905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882551.79999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882551.80003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882551.80062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882551.80066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882551.80101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882551.80340: variable 'network_connections' from source: task vars 27885 1726882551.80352: variable 'interface1' from source: play vars 27885 1726882551.80450: variable 'interface1' from source: play vars 27885 1726882551.80533: variable 'interface1_mac' from source: set_fact 27885 1726882551.80699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882551.80794: 
Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882551.80835: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882551.80867: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882551.80900: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882551.80942: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882551.80966: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882551.80995: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882551.81033: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882551.81085: variable '__network_team_connections_defined' from source: role '' defaults 27885 1726882551.81364: variable 'network_connections' from source: task vars 27885 1726882551.81367: variable 'interface1' from source: play vars 27885 1726882551.81499: variable 'interface1' from source: play vars 27885 1726882551.81503: variable 'interface1_mac' from source: set_fact 27885 1726882551.81533: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27885 1726882551.81537: when evaluation is False, skipping this task 27885 1726882551.81539: _execute() done 27885 1726882551.81542: dumping result to json 27885 1726882551.81544: done dumping result, returning 27885 1726882551.81552: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-3fa5-01be-000000000071] 27885 1726882551.81557: sending task result for task 12673a56-9f93-3fa5-01be-000000000071 27885 1726882551.81839: done sending task result for task 12673a56-9f93-3fa5-01be-000000000071 27885 1726882551.81842: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27885 1726882551.81882: no more pending results, returning what we have 27885 1726882551.81885: results queue empty 27885 1726882551.81886: checking for any_errors_fatal 27885 1726882551.81895: done checking for any_errors_fatal 27885 1726882551.81896: checking for max_fail_percentage 27885 1726882551.81898: done checking for max_fail_percentage 27885 1726882551.81898: checking to see if all hosts have failed and the running result is not ok 27885 1726882551.81899: done checking to see if all hosts have failed 27885 1726882551.81900: getting the remaining hosts for this loop 27885 1726882551.81901: done getting the remaining hosts for this loop 27885 1726882551.81904: getting the next task for host managed_node2 27885 1726882551.81910: done getting next task for host 
managed_node2 27885 1726882551.81913: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 27885 1726882551.81916: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882551.81931: getting variables 27885 1726882551.81932: in VariableManager get_vars() 27885 1726882551.81975: Calling all_inventory to load vars for managed_node2 27885 1726882551.81978: Calling groups_inventory to load vars for managed_node2 27885 1726882551.81981: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882551.81989: Calling all_plugins_play to load vars for managed_node2 27885 1726882551.81992: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882551.82001: Calling groups_plugins_play to load vars for managed_node2 27885 1726882551.83288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882551.84187: done with get_vars() 27885 1726882551.84209: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 27885 1726882551.84263: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:35:51 -0400 (0:00:00.082) 0:00:24.485 ****** 27885 1726882551.84292: entering _queue_task() for managed_node2/yum 27885 1726882551.84544: worker is 1 (out of 1 available) 27885 1726882551.84558: exiting _queue_task() for managed_node2/yum 27885 1726882551.84571: done queuing things up, now waiting for results queue to drain 27885 1726882551.84572: waiting for pending results... 
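At tasks/main.yml:36 the role queues a dnf action to check whether updates for the network packages are available, but only when wireless or team connections are defined; in this run both __network_wireless_connections_defined and __network_team_connections_defined resolve to False, so the task skips before any package transaction runs. A minimal sketch of such a check-mode dnf task follows; the module arguments are assumptions (only the task name, the dnf action, and the when expressions appear in the log), and network_packages is the role default referenced later in this trace.

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"  # assumed argument for illustration
    state: latest
  check_mode: true
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined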
27885 1726882551.84759: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 27885 1726882551.84866: in run() - task 12673a56-9f93-3fa5-01be-000000000072 27885 1726882551.84879: variable 'ansible_search_path' from source: unknown 27885 1726882551.84883: variable 'ansible_search_path' from source: unknown 27885 1726882551.84918: calling self._execute() 27885 1726882551.84995: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882551.85004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882551.85013: variable 'omit' from source: magic vars 27885 1726882551.85295: variable 'ansible_distribution_major_version' from source: facts 27885 1726882551.85306: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882551.85429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882551.86941: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882551.87232: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882551.87258: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882551.87283: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882551.87310: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882551.87367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882551.87386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882551.87410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882551.87437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882551.87448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882551.87517: variable 'ansible_distribution_major_version' from source: facts 27885 1726882551.87528: Evaluated conditional (ansible_distribution_major_version | int < 8): False 27885 1726882551.87536: when evaluation is False, skipping this task 27885 1726882551.87540: _execute() done 27885 1726882551.87542: dumping result to json 27885 1726882551.87544: done dumping result, returning 27885 1726882551.87552: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-3fa5-01be-000000000072] 27885 
1726882551.87556: sending task result for task 12673a56-9f93-3fa5-01be-000000000072 27885 1726882551.87645: done sending task result for task 12673a56-9f93-3fa5-01be-000000000072 27885 1726882551.87648: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 27885 1726882551.87699: no more pending results, returning what we have 27885 1726882551.87702: results queue empty 27885 1726882551.87703: checking for any_errors_fatal 27885 1726882551.87710: done checking for any_errors_fatal 27885 1726882551.87710: checking for max_fail_percentage 27885 1726882551.87712: done checking for max_fail_percentage 27885 1726882551.87712: checking to see if all hosts have failed and the running result is not ok 27885 1726882551.87713: done checking to see if all hosts have failed 27885 1726882551.87714: getting the remaining hosts for this loop 27885 1726882551.87715: done getting the remaining hosts for this loop 27885 1726882551.87719: getting the next task for host managed_node2 27885 1726882551.87725: done getting next task for host managed_node2 27885 1726882551.87729: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 27885 1726882551.87732: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882551.87749: getting variables 27885 1726882551.87750: in VariableManager get_vars() 27885 1726882551.87796: Calling all_inventory to load vars for managed_node2 27885 1726882551.87799: Calling groups_inventory to load vars for managed_node2 27885 1726882551.87802: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882551.87811: Calling all_plugins_play to load vars for managed_node2 27885 1726882551.87813: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882551.87816: Calling groups_plugins_play to load vars for managed_node2 27885 1726882551.88740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882551.89590: done with get_vars() 27885 1726882551.89608: done getting variables 27885 1726882551.89649: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:35:51 -0400 (0:00:00.053) 0:00:24.539 ****** 27885 1726882551.89671: entering _queue_task() for managed_node2/fail 27885 1726882551.89910: worker is 1 (out of 1 available) 27885 1726882551.89924: exiting _queue_task() for managed_node2/fail 27885 1726882551.89936: done queuing things up, now waiting for results queue to drain 27885 1726882551.89937: waiting for pending results... 
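The YUM variant at tasks/main.yml:48 is the legacy counterpart of the DNF check: ansible.builtin.yum is redirected to the dnf action plugin, and the task only applies when ansible_distribution_major_version | int < 8, which is False on this host, so it skips immediately without evaluating the wireless/team flags. A sketch of the pairing, with the same caveat as above that the module arguments are assumed:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"  # assumed argument for illustration
    state: latest
  check_mode: true
  when:
    - ansible_distribution_major_version | int < 8
    # Presumably also gated on the wireless/team flags; the log short-circuits
    # on the version test, so the remaining conditions are not shown there.
    - __network_wireless_connections_defined or __network_team_connections_defined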
27885 1726882551.90121: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 27885 1726882551.90210: in run() - task 12673a56-9f93-3fa5-01be-000000000073 27885 1726882551.90222: variable 'ansible_search_path' from source: unknown 27885 1726882551.90226: variable 'ansible_search_path' from source: unknown 27885 1726882551.90253: calling self._execute() 27885 1726882551.90334: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882551.90340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882551.90348: variable 'omit' from source: magic vars 27885 1726882551.90625: variable 'ansible_distribution_major_version' from source: facts 27885 1726882551.90635: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882551.90720: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882551.90851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882551.92307: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882551.92356: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882551.92383: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882551.92414: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882551.92434: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882551.92494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882551.92517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882551.92534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882551.92562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882551.92572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882551.92609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882551.92626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882551.92642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882551.92667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882551.92684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882551.92714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882551.92730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882551.92746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882551.92769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882551.92784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882551.92890: variable 'network_connections' from source: task vars 27885 1726882551.92906: variable 'interface1' from source: play vars 27885 1726882551.92955: variable 'interface1' from source: play vars 27885 1726882551.93017: variable 'interface1_mac' from source: set_fact 27885 1726882551.93070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882551.93200: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882551.93230: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882551.93253: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882551.93274: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882551.93313: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882551.93331: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882551.93349: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882551.93366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882551.93412: variable '__network_team_connections_defined' from source: 
role '' defaults 27885 1726882551.93563: variable 'network_connections' from source: task vars 27885 1726882551.93566: variable 'interface1' from source: play vars 27885 1726882551.93612: variable 'interface1' from source: play vars 27885 1726882551.93663: variable 'interface1_mac' from source: set_fact 27885 1726882551.93690: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27885 1726882551.93696: when evaluation is False, skipping this task 27885 1726882551.93699: _execute() done 27885 1726882551.93703: dumping result to json 27885 1726882551.93706: done dumping result, returning 27885 1726882551.93713: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-3fa5-01be-000000000073] 27885 1726882551.93724: sending task result for task 12673a56-9f93-3fa5-01be-000000000073 27885 1726882551.93802: done sending task result for task 12673a56-9f93-3fa5-01be-000000000073 27885 1726882551.93805: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27885 1726882551.93851: no more pending results, returning what we have 27885 1726882551.93855: results queue empty 27885 1726882551.93856: checking for any_errors_fatal 27885 1726882551.93860: done checking for any_errors_fatal 27885 1726882551.93861: checking for max_fail_percentage 27885 1726882551.93863: done checking for max_fail_percentage 27885 1726882551.93863: checking to see if all hosts have failed and the running result is not ok 27885 1726882551.93864: done checking to see if all hosts have failed 27885 1726882551.93864: getting the remaining hosts for this loop 27885 1726882551.93867: done getting the remaining hosts for this loop 27885 1726882551.93870: getting the next task for host managed_node2 27885 1726882551.93876: done getting next task for host managed_node2 27885 1726882551.93879: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 27885 1726882551.93882: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882551.93902: getting variables 27885 1726882551.93904: in VariableManager get_vars() 27885 1726882551.93943: Calling all_inventory to load vars for managed_node2 27885 1726882551.93945: Calling groups_inventory to load vars for managed_node2 27885 1726882551.93948: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882551.93956: Calling all_plugins_play to load vars for managed_node2 27885 1726882551.93958: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882551.93961: Calling groups_plugins_play to load vars for managed_node2 27885 1726882551.94745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882551.95618: done with get_vars() 27885 1726882551.95635: done getting variables 27885 1726882551.95674: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:35:51 -0400 (0:00:00.060) 0:00:24.599 ****** 27885 1726882551.95700: entering _queue_task() for managed_node2/package 27885 1726882551.95924: worker is 1 (out of 1 available) 27885 1726882551.95937: exiting _queue_task() for managed_node2/package 27885 1726882551.95950: done queuing things up, now waiting for results queue to drain 27885 1726882551.95951: waiting for pending results... 27885 1726882551.96126: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 27885 1726882551.96214: in run() - task 12673a56-9f93-3fa5-01be-000000000074 27885 1726882551.96226: variable 'ansible_search_path' from source: unknown 27885 1726882551.96230: variable 'ansible_search_path' from source: unknown 27885 1726882551.96256: calling self._execute() 27885 1726882551.96339: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882551.96343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882551.96351: variable 'omit' from source: magic vars 27885 1726882551.96624: variable 'ansible_distribution_major_version' from source: facts 27885 1726882551.96634: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882551.96763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882551.97013: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882551.97016: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882551.97043: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882551.97098: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882551.97170: variable 'network_packages' from source: role '' defaults 27885 1726882551.97246: variable '__network_provider_setup' from source: role '' defaults 27885 1726882551.97254: variable '__network_service_name_default_nm' from source: role '' defaults 27885 1726882551.97308: variable 
'__network_service_name_default_nm' from source: role '' defaults 27885 1726882551.97315: variable '__network_packages_default_nm' from source: role '' defaults 27885 1726882551.97358: variable '__network_packages_default_nm' from source: role '' defaults 27885 1726882551.97471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882551.99052: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882551.99298: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882551.99301: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882551.99304: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882551.99306: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882551.99308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882551.99334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882551.99364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882551.99412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882551.99430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882551.99475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882551.99511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882551.99538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882551.99580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882551.99604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882551.99823: variable '__network_packages_default_gobject_packages' from source: role '' defaults 27885 1726882551.99934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882551.99962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882551.99997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882552.00043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882552.00068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882552.00130: variable 'ansible_python' from source: facts 27885 1726882552.00150: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 27885 1726882552.00212: variable '__network_wpa_supplicant_required' from source: role '' defaults 27885 1726882552.00267: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27885 1726882552.00353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882552.00370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882552.00387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882552.00415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882552.00427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882552.00460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882552.00480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882552.00501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882552.00525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882552.00535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882552.00634: variable 'network_connections' from source: task vars 27885 1726882552.00638: variable 'interface1' from source: play vars 27885 1726882552.00713: variable 'interface1' from source: play vars 27885 1726882552.00791: variable 'interface1_mac' from source: set_fact 27885 1726882552.00853: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882552.00872: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882552.00897: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882552.00922: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882552.00955: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882552.01135: variable 'network_connections' from source: task vars 27885 1726882552.01139: variable 'interface1' from source: play vars 27885 1726882552.01210: variable 'interface1' from source: play vars 27885 1726882552.01289: variable 'interface1_mac' from source: set_fact 27885 1726882552.01342: variable '__network_packages_default_wireless' from source: role '' defaults 27885 1726882552.01392: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882552.01589: variable 'network_connections' from source: task vars 27885 1726882552.01596: variable 'interface1' from source: play vars 27885 1726882552.01643: variable 'interface1' from source: play vars 27885 1726882552.01702: variable 'interface1_mac' from source: set_fact 27885 1726882552.01723: variable '__network_packages_default_team' from source: role '' defaults 27885 1726882552.01778: variable '__network_team_connections_defined' from source: role '' defaults 27885 1726882552.02068: variable 'network_connections' from source: task vars 27885 1726882552.02298: variable 'interface1' from source: play vars 27885 1726882552.02301: variable 'interface1' from source: play vars 27885 1726882552.02303: variable 'interface1_mac' from source: set_fact 27885 1726882552.02305: variable '__network_service_name_default_initscripts' from source: role '' defaults 27885 1726882552.02356: variable '__network_service_name_default_initscripts' from source: role '' defaults 27885 1726882552.02367: variable '__network_packages_default_initscripts' from source: role '' defaults 27885 1726882552.02431: variable '__network_packages_default_initscripts' from source: role '' defaults 27885 1726882552.02650: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 27885 1726882552.03084: variable 'network_connections' from source: task vars 27885 1726882552.03099: variable 'interface1' from source: play vars 27885 1726882552.03159: variable 'interface1' from source: play vars 27885 1726882552.03236: variable 'interface1_mac' from source: set_fact 27885 1726882552.03253: variable 'ansible_distribution' from source: facts 27885 
1726882552.03260: variable '__network_rh_distros' from source: role '' defaults 27885 1726882552.03270: variable 'ansible_distribution_major_version' from source: facts 27885 1726882552.03299: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 27885 1726882552.03461: variable 'ansible_distribution' from source: facts 27885 1726882552.03470: variable '__network_rh_distros' from source: role '' defaults 27885 1726882552.03479: variable 'ansible_distribution_major_version' from source: facts 27885 1726882552.03501: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 27885 1726882552.03661: variable 'ansible_distribution' from source: facts 27885 1726882552.03669: variable '__network_rh_distros' from source: role '' defaults 27885 1726882552.03678: variable 'ansible_distribution_major_version' from source: facts 27885 1726882552.03720: variable 'network_provider' from source: set_fact 27885 1726882552.03739: variable 'ansible_facts' from source: unknown 27885 1726882552.04410: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 27885 1726882552.04422: when evaluation is False, skipping this task 27885 1726882552.04431: _execute() done 27885 1726882552.04440: dumping result to json 27885 1726882552.04448: done dumping result, returning 27885 1726882552.04460: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-3fa5-01be-000000000074] 27885 1726882552.04469: sending task result for task 12673a56-9f93-3fa5-01be-000000000074 skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 27885 1726882552.04627: no more pending results, returning what we have 27885 1726882552.04631: results queue empty 27885 1726882552.04632: checking for any_errors_fatal 27885 1726882552.04640: done checking for any_errors_fatal 27885 1726882552.04641: checking for max_fail_percentage 27885 1726882552.04643: done checking for max_fail_percentage 27885 1726882552.04643: checking to see if all hosts have failed and the running result is not ok 27885 1726882552.04644: done checking to see if all hosts have failed 27885 1726882552.04645: getting the remaining hosts for this loop 27885 1726882552.04646: done getting the remaining hosts for this loop 27885 1726882552.04650: getting the next task for host managed_node2 27885 1726882552.04656: done getting next task for host managed_node2 27885 1726882552.04659: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 27885 1726882552.04662: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882552.04679: getting variables 27885 1726882552.04681: in VariableManager get_vars() 27885 1726882552.04720: Calling all_inventory to load vars for managed_node2 27885 1726882552.04723: Calling groups_inventory to load vars for managed_node2 27885 1726882552.04725: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882552.04735: Calling all_plugins_play to load vars for managed_node2 27885 1726882552.04742: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882552.04745: Calling groups_plugins_play to load vars for managed_node2 27885 1726882552.05309: done sending task result for task 12673a56-9f93-3fa5-01be-000000000074 27885 1726882552.05312: WORKER PROCESS EXITING 27885 1726882552.06346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882552.07949: done with get_vars() 27885 1726882552.07976: done getting variables 27885 1726882552.08039: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:35:52 -0400 (0:00:00.123) 0:00:24.723 ****** 27885 1726882552.08078: entering _queue_task() for managed_node2/package 27885 1726882552.08455: worker is 1 (out of 1 available) 27885 1726882552.08468: exiting _queue_task() for managed_node2/package 27885 1726882552.08481: done queuing things up, now waiting for results queue to drain 27885 1726882552.08598: waiting for pending results... 
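Two more gates resolve in the block above. The consent prompt at tasks/main.yml:60 skips for the same reason as the earlier checks (neither wireless nor team connections are defined), and the Install packages task at tasks/main.yml:73 skips because every entry of network_packages is already present in ansible_facts.packages, so not network_packages is subset(ansible_facts.packages.keys()) evaluates to False. A minimal sketch of that install guard is below; the when expression is taken verbatim from the log, while state: present is an assumption (only the 'package' action and the condition are confirmed). Guarding on the gathered package facts avoids invoking the package manager at all when nothing needs to be installed.

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present  # assumed for illustration
  when:
    - not network_packages is subset(ansible_facts.packages.keys())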
27885 1726882552.08791: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 27885 1726882552.08957: in run() - task 12673a56-9f93-3fa5-01be-000000000075 27885 1726882552.08979: variable 'ansible_search_path' from source: unknown 27885 1726882552.08987: variable 'ansible_search_path' from source: unknown 27885 1726882552.09032: calling self._execute() 27885 1726882552.09151: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882552.09259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882552.09264: variable 'omit' from source: magic vars 27885 1726882552.09568: variable 'ansible_distribution_major_version' from source: facts 27885 1726882552.09590: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882552.09711: variable 'network_state' from source: role '' defaults 27885 1726882552.09725: Evaluated conditional (network_state != {}): False 27885 1726882552.09731: when evaluation is False, skipping this task 27885 1726882552.09737: _execute() done 27885 1726882552.09743: dumping result to json 27885 1726882552.09749: done dumping result, returning 27885 1726882552.09758: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-3fa5-01be-000000000075] 27885 1726882552.09766: sending task result for task 12673a56-9f93-3fa5-01be-000000000075 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27885 1726882552.09947: no more pending results, returning what we have 27885 1726882552.09952: results queue empty 27885 1726882552.09953: checking for any_errors_fatal 27885 1726882552.09959: done checking for any_errors_fatal 27885 1726882552.09960: checking for max_fail_percentage 27885 1726882552.09962: done checking for max_fail_percentage 27885 1726882552.09962: checking to see if all hosts have failed and the running result is not ok 27885 1726882552.09963: done checking to see if all hosts have failed 27885 1726882552.09964: getting the remaining hosts for this loop 27885 1726882552.09966: done getting the remaining hosts for this loop 27885 1726882552.09969: getting the next task for host managed_node2 27885 1726882552.09977: done getting next task for host managed_node2 27885 1726882552.09980: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 27885 1726882552.09984: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882552.10005: getting variables 27885 1726882552.10007: in VariableManager get_vars() 27885 1726882552.10048: Calling all_inventory to load vars for managed_node2 27885 1726882552.10051: Calling groups_inventory to load vars for managed_node2 27885 1726882552.10053: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882552.10065: Calling all_plugins_play to load vars for managed_node2 27885 1726882552.10068: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882552.10070: Calling groups_plugins_play to load vars for managed_node2 27885 1726882552.10706: done sending task result for task 12673a56-9f93-3fa5-01be-000000000075 27885 1726882552.10710: WORKER PROCESS EXITING 27885 1726882552.11756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882552.13446: done with get_vars() 27885 1726882552.13471: done getting variables 27885 1726882552.13534: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:35:52 -0400 (0:00:00.054) 0:00:24.778 ****** 27885 1726882552.13567: entering _queue_task() for managed_node2/package 27885 1726882552.14105: worker is 1 (out of 1 available) 27885 1726882552.14115: exiting _queue_task() for managed_node2/package 27885 1726882552.14125: done queuing things up, now waiting for results queue to drain 27885 1726882552.14126: waiting for pending results... 
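The next task (main.yml:96) follows the same pattern for python3-libnmstate and is skipped for the same reason: network_state is still the role default, an empty dict. For illustration, a hedged sketch of how a caller could make these conditionals true by invoking the role with a non-empty network_state; the interface name and settings below are hypothetical, and the nested structure merely follows the usual nmstate layout rather than anything shown in this log:

- hosts: managed_node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:              # any non-empty value makes "network_state != {}" evaluate True
          interfaces:
            - name: eth1            # hypothetical interface, illustration only
              type: ethernet
              state: up
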
27885 1726882552.14210: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 27885 1726882552.14361: in run() - task 12673a56-9f93-3fa5-01be-000000000076 27885 1726882552.14381: variable 'ansible_search_path' from source: unknown 27885 1726882552.14390: variable 'ansible_search_path' from source: unknown 27885 1726882552.14433: calling self._execute() 27885 1726882552.14540: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882552.14554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882552.14577: variable 'omit' from source: magic vars 27885 1726882552.14956: variable 'ansible_distribution_major_version' from source: facts 27885 1726882552.14972: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882552.15100: variable 'network_state' from source: role '' defaults 27885 1726882552.15122: Evaluated conditional (network_state != {}): False 27885 1726882552.15130: when evaluation is False, skipping this task 27885 1726882552.15136: _execute() done 27885 1726882552.15144: dumping result to json 27885 1726882552.15151: done dumping result, returning 27885 1726882552.15161: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-3fa5-01be-000000000076] 27885 1726882552.15171: sending task result for task 12673a56-9f93-3fa5-01be-000000000076 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27885 1726882552.15430: no more pending results, returning what we have 27885 1726882552.15439: results queue empty 27885 1726882552.15440: checking for any_errors_fatal 27885 1726882552.15450: done checking for any_errors_fatal 27885 1726882552.15451: checking for max_fail_percentage 27885 1726882552.15453: done checking for max_fail_percentage 27885 1726882552.15453: checking to see if all hosts have failed and the running result is not ok 27885 1726882552.15455: done checking to see if all hosts have failed 27885 1726882552.15455: getting the remaining hosts for this loop 27885 1726882552.15457: done getting the remaining hosts for this loop 27885 1726882552.15460: getting the next task for host managed_node2 27885 1726882552.15467: done getting next task for host managed_node2 27885 1726882552.15472: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 27885 1726882552.15476: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882552.15498: done sending task result for task 12673a56-9f93-3fa5-01be-000000000076 27885 1726882552.15502: WORKER PROCESS EXITING 27885 1726882552.15516: getting variables 27885 1726882552.15518: in VariableManager get_vars() 27885 1726882552.15743: Calling all_inventory to load vars for managed_node2 27885 1726882552.15746: Calling groups_inventory to load vars for managed_node2 27885 1726882552.15748: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882552.15757: Calling all_plugins_play to load vars for managed_node2 27885 1726882552.15759: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882552.15762: Calling groups_plugins_play to load vars for managed_node2 27885 1726882552.17040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882552.18627: done with get_vars() 27885 1726882552.18651: done getting variables 27885 1726882552.18717: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:35:52 -0400 (0:00:00.051) 0:00:24.829 ****** 27885 1726882552.18752: entering _queue_task() for managed_node2/service 27885 1726882552.19304: worker is 1 (out of 1 available) 27885 1726882552.19314: exiting _queue_task() for managed_node2/service 27885 1726882552.19325: done queuing things up, now waiting for results queue to drain 27885 1726882552.19326: waiting for pending results... 
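The task queued here (main.yml:109) restarts NetworkManager, but only when wireless or team connections are defined; the evaluation below resolves __network_wireless_connections_defined or __network_team_connections_defined to False and skips it. A sketch of that gating pattern using the 'service' action the log loads; the service name and restart state are inferred from the task title rather than from the role source:

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager        # inferred from the task title
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
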
27885 1726882552.19408: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 27885 1726882552.19666: in run() - task 12673a56-9f93-3fa5-01be-000000000077 27885 1726882552.19671: variable 'ansible_search_path' from source: unknown 27885 1726882552.19674: variable 'ansible_search_path' from source: unknown 27885 1726882552.19676: calling self._execute() 27885 1726882552.19749: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882552.19761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882552.19781: variable 'omit' from source: magic vars 27885 1726882552.20164: variable 'ansible_distribution_major_version' from source: facts 27885 1726882552.20180: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882552.20315: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882552.20520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882552.22852: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882552.22946: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882552.22991: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882552.23045: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882552.23071: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882552.23263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882552.23267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882552.23270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882552.23272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882552.23296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882552.23349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882552.23385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882552.23416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 27885 1726882552.23460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882552.23488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882552.23534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882552.23563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882552.23597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882552.23634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882552.23649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882552.23828: variable 'network_connections' from source: task vars 27885 1726882552.23848: variable 'interface1' from source: play vars 27885 1726882552.24205: variable 'interface1' from source: play vars 27885 1726882552.24208: variable 'interface1_mac' from source: set_fact 27885 1726882552.24423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882552.25326: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882552.25377: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882552.25415: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882552.25453: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882552.25510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882552.25536: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882552.25569: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882552.25605: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882552.25668: variable '__network_team_connections_defined' from source: role '' defaults 27885 1726882552.25924: variable 'network_connections' from 
source: task vars 27885 1726882552.25934: variable 'interface1' from source: play vars 27885 1726882552.26000: variable 'interface1' from source: play vars 27885 1726882552.26117: variable 'interface1_mac' from source: set_fact 27885 1726882552.26126: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27885 1726882552.26133: when evaluation is False, skipping this task 27885 1726882552.26140: _execute() done 27885 1726882552.26146: dumping result to json 27885 1726882552.26153: done dumping result, returning 27885 1726882552.26163: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-3fa5-01be-000000000077] 27885 1726882552.26181: sending task result for task 12673a56-9f93-3fa5-01be-000000000077 27885 1726882552.26522: done sending task result for task 12673a56-9f93-3fa5-01be-000000000077 27885 1726882552.26525: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27885 1726882552.26572: no more pending results, returning what we have 27885 1726882552.26576: results queue empty 27885 1726882552.26578: checking for any_errors_fatal 27885 1726882552.26583: done checking for any_errors_fatal 27885 1726882552.26584: checking for max_fail_percentage 27885 1726882552.26585: done checking for max_fail_percentage 27885 1726882552.26586: checking to see if all hosts have failed and the running result is not ok 27885 1726882552.26587: done checking to see if all hosts have failed 27885 1726882552.26587: getting the remaining hosts for this loop 27885 1726882552.26589: done getting the remaining hosts for this loop 27885 1726882552.26595: getting the next task for host managed_node2 27885 1726882552.26602: done getting next task for host managed_node2 27885 1726882552.26606: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 27885 1726882552.26610: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882552.26638: getting variables 27885 1726882552.26641: in VariableManager get_vars() 27885 1726882552.26707: Calling all_inventory to load vars for managed_node2 27885 1726882552.26710: Calling groups_inventory to load vars for managed_node2 27885 1726882552.26712: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882552.26722: Calling all_plugins_play to load vars for managed_node2 27885 1726882552.26725: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882552.26728: Calling groups_plugins_play to load vars for managed_node2 27885 1726882552.28416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882552.32156: done with get_vars() 27885 1726882552.32186: done getting variables 27885 1726882552.32452: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:35:52 -0400 (0:00:00.139) 0:00:24.969 ****** 27885 1726882552.32716: entering _queue_task() for managed_node2/service 27885 1726882552.33262: worker is 1 (out of 1 available) 27885 1726882552.33274: exiting _queue_task() for managed_node2/service 27885 1726882552.33286: done queuing things up, now waiting for results queue to drain 27885 1726882552.33287: waiting for pending results... 27885 1726882552.33838: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 27885 1726882552.33999: in run() - task 12673a56-9f93-3fa5-01be-000000000078 27885 1726882552.34159: variable 'ansible_search_path' from source: unknown 27885 1726882552.34163: variable 'ansible_search_path' from source: unknown 27885 1726882552.34201: calling self._execute() 27885 1726882552.34412: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882552.34416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882552.34474: variable 'omit' from source: magic vars 27885 1726882552.35247: variable 'ansible_distribution_major_version' from source: facts 27885 1726882552.35259: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882552.35464: variable 'network_provider' from source: set_fact 27885 1726882552.35468: variable 'network_state' from source: role '' defaults 27885 1726882552.35479: Evaluated conditional (network_provider == "nm" or network_state != {}): True 27885 1726882552.35486: variable 'omit' from source: magic vars 27885 1726882552.35898: variable 'omit' from source: magic vars 27885 1726882552.35901: variable 'network_service_name' from source: role '' defaults 27885 1726882552.35949: variable 'network_service_name' from source: role '' defaults 27885 1726882552.36207: variable '__network_provider_setup' from source: role '' defaults 27885 1726882552.36212: variable '__network_service_name_default_nm' from source: role '' defaults 27885 1726882552.36326: variable '__network_service_name_default_nm' from source: role '' defaults 27885 1726882552.36338: variable '__network_packages_default_nm' from source: role '' defaults 
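For this task (main.yml:122) the conditional network_provider == "nm" or network_state != {} evaluates to True, so the service action actually runs: the role resolves network_service_name and, further down, ships an AnsiballZ_systemd.py module whose invocation reports name=NetworkManager, state=started, enabled=true. A sketch of an enable-and-start task consistent with those observed arguments; the real role task may call the systemd module directly, and the variable wiring shown here is an assumption:

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"   # resolves to NetworkManager in this run
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
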
27885 1726882552.36514: variable '__network_packages_default_nm' from source: role '' defaults 27885 1726882552.36953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882552.41418: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882552.41543: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882552.41691: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882552.41770: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882552.41773: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882552.41948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882552.41978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882552.42117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882552.42157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882552.42171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882552.42498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882552.42501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882552.42504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882552.42506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882552.42509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882552.43082: variable '__network_packages_default_gobject_packages' from source: role '' defaults 27885 1726882552.43311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882552.43336: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882552.43356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882552.43498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882552.43501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882552.43590: variable 'ansible_python' from source: facts 27885 1726882552.43876: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 27885 1726882552.43879: variable '__network_wpa_supplicant_required' from source: role '' defaults 27885 1726882552.44081: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27885 1726882552.44322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882552.44346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882552.44369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882552.44522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882552.44535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882552.44582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882552.44703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882552.44749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882552.44768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882552.44858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882552.45117: variable 'network_connections' from 
source: task vars 27885 1726882552.45125: variable 'interface1' from source: play vars 27885 1726882552.45313: variable 'interface1' from source: play vars 27885 1726882552.45516: variable 'interface1_mac' from source: set_fact 27885 1726882552.45742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882552.46180: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882552.46232: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882552.46327: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882552.46441: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882552.46505: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882552.46531: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882552.46595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882552.46738: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882552.46835: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882552.47549: variable 'network_connections' from source: task vars 27885 1726882552.47555: variable 'interface1' from source: play vars 27885 1726882552.47745: variable 'interface1' from source: play vars 27885 1726882552.47877: variable 'interface1_mac' from source: set_fact 27885 1726882552.48066: variable '__network_packages_default_wireless' from source: role '' defaults 27885 1726882552.48213: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882552.48616: variable 'network_connections' from source: task vars 27885 1726882552.48620: variable 'interface1' from source: play vars 27885 1726882552.48622: variable 'interface1' from source: play vars 27885 1726882552.48709: variable 'interface1_mac' from source: set_fact 27885 1726882552.48733: variable '__network_packages_default_team' from source: role '' defaults 27885 1726882552.48820: variable '__network_team_connections_defined' from source: role '' defaults 27885 1726882552.49200: variable 'network_connections' from source: task vars 27885 1726882552.49203: variable 'interface1' from source: play vars 27885 1726882552.49206: variable 'interface1' from source: play vars 27885 1726882552.49263: variable 'interface1_mac' from source: set_fact 27885 1726882552.49334: variable '__network_service_name_default_initscripts' from source: role '' defaults 27885 1726882552.49389: variable '__network_service_name_default_initscripts' from source: role '' defaults 27885 1726882552.49402: variable '__network_packages_default_initscripts' from source: role '' defaults 27885 1726882552.49464: variable '__network_packages_default_initscripts' from source: role '' defaults 27885 1726882552.49743: variable 
'__network_packages_default_initscripts_bridge' from source: role '' defaults 27885 1726882552.50343: variable 'network_connections' from source: task vars 27885 1726882552.50346: variable 'interface1' from source: play vars 27885 1726882552.50412: variable 'interface1' from source: play vars 27885 1726882552.50479: variable 'interface1_mac' from source: set_fact 27885 1726882552.50495: variable 'ansible_distribution' from source: facts 27885 1726882552.50501: variable '__network_rh_distros' from source: role '' defaults 27885 1726882552.50512: variable 'ansible_distribution_major_version' from source: facts 27885 1726882552.50533: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 27885 1726882552.50710: variable 'ansible_distribution' from source: facts 27885 1726882552.50715: variable '__network_rh_distros' from source: role '' defaults 27885 1726882552.50718: variable 'ansible_distribution_major_version' from source: facts 27885 1726882552.50737: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 27885 1726882552.50941: variable 'ansible_distribution' from source: facts 27885 1726882552.50944: variable '__network_rh_distros' from source: role '' defaults 27885 1726882552.50946: variable 'ansible_distribution_major_version' from source: facts 27885 1726882552.50955: variable 'network_provider' from source: set_fact 27885 1726882552.50977: variable 'omit' from source: magic vars 27885 1726882552.51007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882552.51036: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882552.51098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882552.51101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882552.51103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882552.51112: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882552.51114: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882552.51119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882552.51225: Set connection var ansible_pipelining to False 27885 1726882552.51229: Set connection var ansible_connection to ssh 27885 1726882552.51235: Set connection var ansible_timeout to 10 27885 1726882552.51238: Set connection var ansible_shell_type to sh 27885 1726882552.51265: Set connection var ansible_shell_executable to /bin/sh 27885 1726882552.51268: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882552.51276: variable 'ansible_shell_executable' from source: unknown 27885 1726882552.51279: variable 'ansible_connection' from source: unknown 27885 1726882552.51281: variable 'ansible_module_compression' from source: unknown 27885 1726882552.51284: variable 'ansible_shell_type' from source: unknown 27885 1726882552.51286: variable 'ansible_shell_executable' from source: unknown 27885 1726882552.51375: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882552.51378: variable 'ansible_pipelining' from source: unknown 27885 1726882552.51382: variable 'ansible_timeout' from source: unknown 27885 
1726882552.51384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882552.51407: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882552.51418: variable 'omit' from source: magic vars 27885 1726882552.51424: starting attempt loop 27885 1726882552.51427: running the handler 27885 1726882552.51507: variable 'ansible_facts' from source: unknown 27885 1726882552.52526: _low_level_execute_command(): starting 27885 1726882552.52533: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882552.53336: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882552.53384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882552.53417: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882552.53424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882552.53526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882552.55224: stdout chunk (state=3): >>>/root <<< 27885 1726882552.55388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882552.55400: stdout chunk (state=3): >>><<< 27885 1726882552.55403: stderr chunk (state=3): >>><<< 27885 1726882552.55599: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882552.55603: _low_level_execute_command(): starting 27885 1726882552.55606: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882552.5547924-29034-163182331614317 `" && echo ansible-tmp-1726882552.5547924-29034-163182331614317="` echo /root/.ansible/tmp/ansible-tmp-1726882552.5547924-29034-163182331614317 `" ) && sleep 0' 27885 1726882552.56706: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882552.56769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882552.56933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882552.58808: stdout chunk (state=3): >>>ansible-tmp-1726882552.5547924-29034-163182331614317=/root/.ansible/tmp/ansible-tmp-1726882552.5547924-29034-163182331614317 <<< 27885 1726882552.59001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882552.59047: stdout chunk (state=3): >>><<< 27885 1726882552.59050: stderr chunk (state=3): >>><<< 27885 1726882552.59317: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882552.5547924-29034-163182331614317=/root/.ansible/tmp/ansible-tmp-1726882552.5547924-29034-163182331614317 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882552.59321: variable 'ansible_module_compression' from source: unknown 27885 1726882552.59323: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 27885 1726882552.59460: variable 'ansible_facts' from source: unknown 27885 1726882552.59931: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882552.5547924-29034-163182331614317/AnsiballZ_systemd.py 27885 1726882552.60321: Sending initial data 27885 1726882552.60330: Sent initial data (156 bytes) 27885 1726882552.60960: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882552.60965: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882552.60972: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882552.60974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882552.61010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882552.61232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882552.62622: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882552.62692: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27885 1726882552.62772: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpeqzcqjoc /root/.ansible/tmp/ansible-tmp-1726882552.5547924-29034-163182331614317/AnsiballZ_systemd.py <<< 27885 1726882552.62776: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882552.5547924-29034-163182331614317/AnsiballZ_systemd.py" <<< 27885 1726882552.62835: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpeqzcqjoc" to remote "/root/.ansible/tmp/ansible-tmp-1726882552.5547924-29034-163182331614317/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882552.5547924-29034-163182331614317/AnsiballZ_systemd.py" <<< 27885 1726882552.65089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882552.65095: stdout chunk (state=3): >>><<< 27885 1726882552.65098: stderr chunk (state=3): >>><<< 27885 1726882552.65100: done transferring module to remote 27885 1726882552.65102: _low_level_execute_command(): starting 27885 1726882552.65105: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882552.5547924-29034-163182331614317/ /root/.ansible/tmp/ansible-tmp-1726882552.5547924-29034-163182331614317/AnsiballZ_systemd.py && sleep 0' 27885 1726882552.65651: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882552.65658: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882552.65709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882552.65759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882552.65772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882552.65789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882552.65884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882552.67753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882552.67757: stdout chunk (state=3): >>><<< 27885 1726882552.67759: stderr chunk (state=3): >>><<< 27885 1726882552.67762: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882552.67764: _low_level_execute_command(): starting 27885 1726882552.67767: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882552.5547924-29034-163182331614317/AnsiballZ_systemd.py && sleep 0' 27885 1726882552.68368: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882552.68382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882552.68399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882552.68420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882552.68520: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882552.68554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882552.68656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882552.97184: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": 
"success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4694016", "MemoryPeak": "7507968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3304992768", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1360461000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 27885 1726882552.97203: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": 
"infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", 
"RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target multi-user.target", "After": "basi<<< 27885 1726882552.97207: stdout chunk (state=3): >>>c.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 27885 1726882552.98924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882552.98929: stdout chunk (state=3): >>><<< 27885 1726882552.98931: stderr chunk (state=3): >>><<< 27885 1726882552.98952: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4694016", "MemoryPeak": "7507968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3304992768", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1360461000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target multi-user.target", "After": "basic.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 27885 1726882552.99273: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882552.5547924-29034-163182331614317/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882552.99277: _low_level_execute_command(): starting 27885 1726882552.99279: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882552.5547924-29034-163182331614317/ > /dev/null 2>&1 && sleep 0' 27885 1726882552.99831: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882552.99839: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882552.99850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882552.99865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882552.99907: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882553.00017: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882553.00021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882553.00023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882553.00084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882553.01923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882553.01927: stdout chunk (state=3): >>><<< 27885 1726882553.01930: stderr chunk (state=3): >>><<< 27885 1726882553.01947: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882553.02097: handler run complete 27885 1726882553.02101: attempt loop complete, returning result 27885 1726882553.02103: _execute() done 27885 1726882553.02105: dumping result to json 27885 1726882553.02107: done dumping result, returning 27885 1726882553.02109: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-3fa5-01be-000000000078] 27885 1726882553.02111: sending task result for task 12673a56-9f93-3fa5-01be-000000000078 27885 1726882553.02499: done sending task result for task 12673a56-9f93-3fa5-01be-000000000078 27885 1726882553.02502: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27885 1726882553.02559: no more pending results, returning what we have 27885 1726882553.02563: results queue empty 27885 1726882553.02564: checking for any_errors_fatal 27885 1726882553.02572: done checking for any_errors_fatal 27885 1726882553.02573: checking for max_fail_percentage 27885 1726882553.02575: done checking for max_fail_percentage 27885 1726882553.02576: checking to see if all hosts have failed and the running result is not ok 27885 1726882553.02577: done checking to see if all hosts have failed 27885 1726882553.02577: getting the remaining hosts for this loop 27885 1726882553.02579: done getting the remaining hosts for this loop 27885 1726882553.02584: getting the next task for host managed_node2 27885 1726882553.02590: done getting next task for host managed_node2 27885 1726882553.02597: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 27885 
1726882553.02601: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882553.02612: getting variables 27885 1726882553.02613: in VariableManager get_vars() 27885 1726882553.02654: Calling all_inventory to load vars for managed_node2 27885 1726882553.02657: Calling groups_inventory to load vars for managed_node2 27885 1726882553.02660: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882553.02670: Calling all_plugins_play to load vars for managed_node2 27885 1726882553.02674: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882553.02677: Calling groups_plugins_play to load vars for managed_node2 27885 1726882553.04268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882553.05823: done with get_vars() 27885 1726882553.05851: done getting variables 27885 1726882553.05912: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:35:53 -0400 (0:00:00.732) 0:00:25.702 ****** 27885 1726882553.05982: entering _queue_task() for managed_node2/service 27885 1726882553.06714: worker is 1 (out of 1 available) 27885 1726882553.06726: exiting _queue_task() for managed_node2/service 27885 1726882553.06738: done queuing things up, now waiting for results queue to drain 27885 1726882553.06739: waiting for pending results... 
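(Editor's note) The module_args captured in the result above show exactly what the role handed to the systemd module for this task: name=NetworkManager, state=started, enabled=true, scope=system. As an illustration only, and not the role's actual task file, a standalone task that drives the module with the same arguments might look like this sketch:

# Minimal sketch of an equivalent task; the real task lives inside the
# fedora.linux_system_roles.network role and runs under no_log.
- name: Enable and start NetworkManager (illustrative sketch)
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true

A "changed": false result, as seen above, simply means the unit was already enabled and active when the module ran.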
27885 1726882553.07243: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 27885 1726882553.07814: in run() - task 12673a56-9f93-3fa5-01be-000000000079 27885 1726882553.07819: variable 'ansible_search_path' from source: unknown 27885 1726882553.07821: variable 'ansible_search_path' from source: unknown 27885 1726882553.07824: calling self._execute() 27885 1726882553.07826: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882553.07828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882553.07901: variable 'omit' from source: magic vars 27885 1726882553.08335: variable 'ansible_distribution_major_version' from source: facts 27885 1726882553.08356: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882553.08487: variable 'network_provider' from source: set_fact 27885 1726882553.08499: Evaluated conditional (network_provider == "nm"): True 27885 1726882553.08597: variable '__network_wpa_supplicant_required' from source: role '' defaults 27885 1726882553.08690: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27885 1726882553.08863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882553.13564: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882553.13810: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882553.13814: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882553.13999: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882553.14002: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882553.14242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882553.14246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882553.14248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882553.14314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882553.14332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882553.14396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882553.14483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 27885 1726882553.14591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882553.14636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882553.14688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882553.14734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882553.14999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882553.15002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882553.15005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882553.15007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882553.15350: variable 'network_connections' from source: task vars 27885 1726882553.15367: variable 'interface1' from source: play vars 27885 1726882553.15517: variable 'interface1' from source: play vars 27885 1726882553.15661: variable 'interface1_mac' from source: set_fact 27885 1726882553.15752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882553.16300: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882553.16303: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882553.16335: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882553.16438: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882553.16486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882553.16541: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882553.16653: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882553.16684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882553.16950: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882553.17260: variable 'network_connections' from source: task vars 27885 1726882553.17286: variable 'interface1' from source: play vars 27885 1726882553.17352: variable 'interface1' from source: play vars 27885 1726882553.17436: variable 'interface1_mac' from source: set_fact 27885 1726882553.17483: Evaluated conditional (__network_wpa_supplicant_required): False 27885 1726882553.17499: when evaluation is False, skipping this task 27885 1726882553.17507: _execute() done 27885 1726882553.17515: dumping result to json 27885 1726882553.17523: done dumping result, returning 27885 1726882553.17544: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-3fa5-01be-000000000079] 27885 1726882553.17554: sending task result for task 12673a56-9f93-3fa5-01be-000000000079 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 27885 1726882553.17703: no more pending results, returning what we have 27885 1726882553.17707: results queue empty 27885 1726882553.17709: checking for any_errors_fatal 27885 1726882553.17730: done checking for any_errors_fatal 27885 1726882553.17731: checking for max_fail_percentage 27885 1726882553.17733: done checking for max_fail_percentage 27885 1726882553.17734: checking to see if all hosts have failed and the running result is not ok 27885 1726882553.17735: done checking to see if all hosts have failed 27885 1726882553.17736: getting the remaining hosts for this loop 27885 1726882553.17738: done getting the remaining hosts for this loop 27885 1726882553.17742: getting the next task for host managed_node2 27885 1726882553.17750: done getting next task for host managed_node2 27885 1726882553.17754: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 27885 1726882553.17758: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882553.17776: getting variables 27885 1726882553.17778: in VariableManager get_vars() 27885 1726882553.17820: Calling all_inventory to load vars for managed_node2 27885 1726882553.17823: Calling groups_inventory to load vars for managed_node2 27885 1726882553.17825: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882553.17835: Calling all_plugins_play to load vars for managed_node2 27885 1726882553.17837: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882553.17839: Calling groups_plugins_play to load vars for managed_node2 27885 1726882553.18846: done sending task result for task 12673a56-9f93-3fa5-01be-000000000079 27885 1726882553.18849: WORKER PROCESS EXITING 27885 1726882553.19746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882553.21355: done with get_vars() 27885 1726882553.21384: done getting variables 27885 1726882553.21447: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:35:53 -0400 (0:00:00.155) 0:00:25.857 ****** 27885 1726882553.21485: entering _queue_task() for managed_node2/service 27885 1726882553.22027: worker is 1 (out of 1 available) 27885 1726882553.22036: exiting _queue_task() for managed_node2/service 27885 1726882553.22048: done queuing things up, now waiting for results queue to drain 27885 1726882553.22049: waiting for pending results... 
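(Editor's note) The wpa_supplicant task above was skipped because the role-internal flag __network_wpa_supplicant_required evaluated to False: no wireless or IEEE 802.1X connections are present in the rendered network_connections, so the supplicant is not needed. Any task gated by a when: clause produces this same "skipping" result shape (false_condition plus "Conditional result was False") when its condition is false. A minimal self-contained sketch, using a hypothetical flag name rather than the role's internal variable, is:

# Sketch only: 'my_supplicant_required' is a hypothetical variable,
# not one defined by the role; here it forces the same skip behavior.
- name: Enable and start wpa_supplicant (sketch)
  ansible.builtin.systemd:
    name: wpa_supplicant
    state: started
    enabled: true
  vars:
    my_supplicant_required: false
  when: my_supplicant_required | bool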
27885 1726882553.22149: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 27885 1726882553.22297: in run() - task 12673a56-9f93-3fa5-01be-00000000007a 27885 1726882553.22318: variable 'ansible_search_path' from source: unknown 27885 1726882553.22325: variable 'ansible_search_path' from source: unknown 27885 1726882553.22363: calling self._execute() 27885 1726882553.22471: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882553.22496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882553.22603: variable 'omit' from source: magic vars 27885 1726882553.23270: variable 'ansible_distribution_major_version' from source: facts 27885 1726882553.23312: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882553.23701: variable 'network_provider' from source: set_fact 27885 1726882553.23705: Evaluated conditional (network_provider == "initscripts"): False 27885 1726882553.23707: when evaluation is False, skipping this task 27885 1726882553.23709: _execute() done 27885 1726882553.23711: dumping result to json 27885 1726882553.23713: done dumping result, returning 27885 1726882553.23716: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-3fa5-01be-00000000007a] 27885 1726882553.23718: sending task result for task 12673a56-9f93-3fa5-01be-00000000007a 27885 1726882553.23801: done sending task result for task 12673a56-9f93-3fa5-01be-00000000007a skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27885 1726882553.23850: no more pending results, returning what we have 27885 1726882553.23855: results queue empty 27885 1726882553.23856: checking for any_errors_fatal 27885 1726882553.23868: done checking for any_errors_fatal 27885 1726882553.23869: checking for max_fail_percentage 27885 1726882553.23870: done checking for max_fail_percentage 27885 1726882553.23871: checking to see if all hosts have failed and the running result is not ok 27885 1726882553.23872: done checking to see if all hosts have failed 27885 1726882553.23873: getting the remaining hosts for this loop 27885 1726882553.23875: done getting the remaining hosts for this loop 27885 1726882553.23879: getting the next task for host managed_node2 27885 1726882553.23886: done getting next task for host managed_node2 27885 1726882553.23890: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 27885 1726882553.23896: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882553.23918: getting variables 27885 1726882553.23920: in VariableManager get_vars() 27885 1726882553.23959: Calling all_inventory to load vars for managed_node2 27885 1726882553.23962: Calling groups_inventory to load vars for managed_node2 27885 1726882553.23964: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882553.23975: Calling all_plugins_play to load vars for managed_node2 27885 1726882553.23978: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882553.23980: Calling groups_plugins_play to load vars for managed_node2 27885 1726882553.24801: WORKER PROCESS EXITING 27885 1726882553.27024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882553.28863: done with get_vars() 27885 1726882553.29119: done getting variables 27885 1726882553.29188: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:35:53 -0400 (0:00:00.077) 0:00:25.934 ****** 27885 1726882553.29232: entering _queue_task() for managed_node2/copy 27885 1726882553.29705: worker is 1 (out of 1 available) 27885 1726882553.29717: exiting _queue_task() for managed_node2/copy 27885 1726882553.29729: done queuing things up, now waiting for results queue to drain 27885 1726882553.29730: waiting for pending results... 
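(Editor's note) Both skips above hinge on the network_provider value: the legacy "Enable network service" and "Ensure initscripts network file dependency is present" tasks only run when network_provider is "initscripts", and on this host it resolved to "nm". Callers of the role usually pin the provider explicitly rather than relying on autodetection; a hedged example of how that is commonly expressed in a playbook (host name taken from this run, values assumed) is:

# Illustrative playbook snippet; assumes the fedora.linux_system_roles.network
# role is installed. network_provider selects the backend:
# "nm" for NetworkManager, "initscripts" for the legacy network service.
- hosts: managed_node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm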
27885 1726882553.29924: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 27885 1726882553.30073: in run() - task 12673a56-9f93-3fa5-01be-00000000007b 27885 1726882553.30095: variable 'ansible_search_path' from source: unknown 27885 1726882553.30104: variable 'ansible_search_path' from source: unknown 27885 1726882553.30146: calling self._execute() 27885 1726882553.30257: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882553.30368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882553.30373: variable 'omit' from source: magic vars 27885 1726882553.30725: variable 'ansible_distribution_major_version' from source: facts 27885 1726882553.30743: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882553.30878: variable 'network_provider' from source: set_fact 27885 1726882553.30888: Evaluated conditional (network_provider == "initscripts"): False 27885 1726882553.30898: when evaluation is False, skipping this task 27885 1726882553.30906: _execute() done 27885 1726882553.30915: dumping result to json 27885 1726882553.30931: done dumping result, returning 27885 1726882553.30944: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-3fa5-01be-00000000007b] 27885 1726882553.30954: sending task result for task 12673a56-9f93-3fa5-01be-00000000007b skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 27885 1726882553.31197: no more pending results, returning what we have 27885 1726882553.31201: results queue empty 27885 1726882553.31203: checking for any_errors_fatal 27885 1726882553.31210: done checking for any_errors_fatal 27885 1726882553.31211: checking for max_fail_percentage 27885 1726882553.31212: done checking for max_fail_percentage 27885 1726882553.31213: checking to see if all hosts have failed and the running result is not ok 27885 1726882553.31214: done checking to see if all hosts have failed 27885 1726882553.31214: getting the remaining hosts for this loop 27885 1726882553.31217: done getting the remaining hosts for this loop 27885 1726882553.31221: getting the next task for host managed_node2 27885 1726882553.31229: done getting next task for host managed_node2 27885 1726882553.31234: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 27885 1726882553.31238: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882553.31261: getting variables 27885 1726882553.31263: in VariableManager get_vars() 27885 1726882553.31307: Calling all_inventory to load vars for managed_node2 27885 1726882553.31311: Calling groups_inventory to load vars for managed_node2 27885 1726882553.31313: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882553.31327: Calling all_plugins_play to load vars for managed_node2 27885 1726882553.31330: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882553.31333: Calling groups_plugins_play to load vars for managed_node2 27885 1726882553.32100: done sending task result for task 12673a56-9f93-3fa5-01be-00000000007b 27885 1726882553.32103: WORKER PROCESS EXITING 27885 1726882553.33417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882553.35711: done with get_vars() 27885 1726882553.35737: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:35:53 -0400 (0:00:00.065) 0:00:26.000 ****** 27885 1726882553.35825: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 27885 1726882553.36170: worker is 1 (out of 1 available) 27885 1726882553.36183: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 27885 1726882553.36399: done queuing things up, now waiting for results queue to drain 27885 1726882553.36401: waiting for pending results... 27885 1726882553.36484: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 27885 1726882553.36632: in run() - task 12673a56-9f93-3fa5-01be-00000000007c 27885 1726882553.36667: variable 'ansible_search_path' from source: unknown 27885 1726882553.36679: variable 'ansible_search_path' from source: unknown 27885 1726882553.36731: calling self._execute() 27885 1726882553.36864: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882553.36875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882553.36889: variable 'omit' from source: magic vars 27885 1726882553.37258: variable 'ansible_distribution_major_version' from source: facts 27885 1726882553.37274: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882553.37290: variable 'omit' from source: magic vars 27885 1726882553.37353: variable 'omit' from source: magic vars 27885 1726882553.37522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882553.40167: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882553.40240: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882553.40284: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882553.40328: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882553.40357: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882553.40508: variable 'network_provider' from source: set_fact 27885 1726882553.40574: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882553.40921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882553.40955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882553.41002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882553.41022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882553.41108: variable 'omit' from source: magic vars 27885 1726882553.41233: variable 'omit' from source: magic vars 27885 1726882553.41433: variable 'network_connections' from source: task vars 27885 1726882553.41450: variable 'interface1' from source: play vars 27885 1726882553.41524: variable 'interface1' from source: play vars 27885 1726882553.41608: variable 'interface1_mac' from source: set_fact 27885 1726882553.41816: variable 'omit' from source: magic vars 27885 1726882553.41819: variable '__lsr_ansible_managed' from source: task vars 27885 1726882553.41878: variable '__lsr_ansible_managed' from source: task vars 27885 1726882553.42163: Loaded config def from plugin (lookup/template) 27885 1726882553.42173: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 27885 1726882553.42206: File lookup term: get_ansible_managed.j2 27885 1726882553.42214: variable 'ansible_search_path' from source: unknown 27885 1726882553.42223: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 27885 1726882553.42238: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 27885 1726882553.42263: variable 'ansible_search_path' from source: unknown 27885 1726882553.48296: variable 'ansible_managed' from source: unknown 27885 1726882553.48374: variable 'omit' from source: magic vars 
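(Editor's note) At this point the role has resolved the network_connections list (built in this test from the interface1 and interface1_mac variables), rendered the ansible_managed header via the get_ansible_managed.j2 template lookup, and is about to transfer its network_connections module to the target. The actual profile under test is hidden by no_log, but a representative network_connections entry that matches an interface by MAC address, following the role's documented variable format, could look like the sketch below (the name, MAC, and addressing are placeholders, not values taken from this log):

# Illustrative only: a network_connections entry keyed on a MAC address.
network_connections:
  - name: ethtest0            # placeholder profile name
    type: ethernet
    mac: "52:54:00:12:34:56"  # placeholder MAC used for interface matching
    state: up
    ip:
      dhcp4: true
      auto6: true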
27885 1726882553.48410: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882553.48443: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882553.48466: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882553.48490: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882553.48513: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882553.48547: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882553.48617: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882553.48620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882553.48667: Set connection var ansible_pipelining to False 27885 1726882553.48679: Set connection var ansible_connection to ssh 27885 1726882553.48689: Set connection var ansible_timeout to 10 27885 1726882553.48699: Set connection var ansible_shell_type to sh 27885 1726882553.48710: Set connection var ansible_shell_executable to /bin/sh 27885 1726882553.48722: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882553.48749: variable 'ansible_shell_executable' from source: unknown 27885 1726882553.48755: variable 'ansible_connection' from source: unknown 27885 1726882553.48761: variable 'ansible_module_compression' from source: unknown 27885 1726882553.48765: variable 'ansible_shell_type' from source: unknown 27885 1726882553.48770: variable 'ansible_shell_executable' from source: unknown 27885 1726882553.48775: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882553.48836: variable 'ansible_pipelining' from source: unknown 27885 1726882553.48839: variable 'ansible_timeout' from source: unknown 27885 1726882553.48841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882553.48916: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882553.48942: variable 'omit' from source: magic vars 27885 1726882553.49059: starting attempt loop 27885 1726882553.49062: running the handler 27885 1726882553.49065: _low_level_execute_command(): starting 27885 1726882553.49067: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882553.49810: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882553.49871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882553.49891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882553.49922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882553.50028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882553.51680: stdout chunk (state=3): >>>/root <<< 27885 1726882553.51841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882553.51844: stdout chunk (state=3): >>><<< 27885 1726882553.51846: stderr chunk (state=3): >>><<< 27885 1726882553.51863: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882553.51880: _low_level_execute_command(): starting 27885 1726882553.51961: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882553.518697-29082-51190619622960 `" && echo ansible-tmp-1726882553.518697-29082-51190619622960="` echo /root/.ansible/tmp/ansible-tmp-1726882553.518697-29082-51190619622960 `" ) && sleep 0' 27885 1726882553.52519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882553.52538: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882553.52555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882553.52572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882553.52665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882553.52696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882553.52714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882553.52736: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882553.52831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882553.54710: stdout chunk (state=3): >>>ansible-tmp-1726882553.518697-29082-51190619622960=/root/.ansible/tmp/ansible-tmp-1726882553.518697-29082-51190619622960 <<< 27885 1726882553.54857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882553.54866: stdout chunk (state=3): >>><<< 27885 1726882553.54881: stderr chunk (state=3): >>><<< 27885 1726882553.54999: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882553.518697-29082-51190619622960=/root/.ansible/tmp/ansible-tmp-1726882553.518697-29082-51190619622960 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882553.55002: variable 'ansible_module_compression' from source: unknown 27885 1726882553.55010: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 27885 1726882553.55046: variable 'ansible_facts' from source: unknown 27885 1726882553.55172: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882553.518697-29082-51190619622960/AnsiballZ_network_connections.py 27885 1726882553.55357: Sending initial data 27885 1726882553.55360: Sent initial data (166 bytes) 27885 1726882553.55967: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882553.55999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882553.56107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882553.56150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882553.56206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882553.57762: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882553.57826: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27885 1726882553.57930: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpip7kiiwi /root/.ansible/tmp/ansible-tmp-1726882553.518697-29082-51190619622960/AnsiballZ_network_connections.py <<< 27885 1726882553.57933: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882553.518697-29082-51190619622960/AnsiballZ_network_connections.py" <<< 27885 1726882553.58025: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpip7kiiwi" to remote "/root/.ansible/tmp/ansible-tmp-1726882553.518697-29082-51190619622960/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882553.518697-29082-51190619622960/AnsiballZ_network_connections.py" <<< 27885 1726882553.59213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882553.59251: stderr chunk (state=3): >>><<< 27885 1726882553.59260: stdout chunk (state=3): >>><<< 27885 1726882553.59348: done transferring module to remote 27885 1726882553.59351: _low_level_execute_command(): starting 27885 1726882553.59353: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882553.518697-29082-51190619622960/ /root/.ansible/tmp/ansible-tmp-1726882553.518697-29082-51190619622960/AnsiballZ_network_connections.py && sleep 0' 27885 1726882553.60220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882553.60310: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882553.60350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882553.60364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882553.60385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882553.60506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882553.62208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882553.62252: stderr chunk (state=3): >>><<< 27885 1726882553.62265: stdout chunk (state=3): >>><<< 27885 1726882553.62282: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882553.62295: _low_level_execute_command(): starting 27885 1726882553.62368: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882553.518697-29082-51190619622960/AnsiballZ_network_connections.py && sleep 0' 27885 1726882553.63231: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882553.63234: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882553.63236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882553.63252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882553.63267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882553.63276: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882553.63525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882553.63549: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882553.63645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882553.91274: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 9dc4eccb-4733-427c-9f5b-05ec76f599ff\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "ca:0d:d0:ac:e2:a3", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "ca:0d:d0:ac:e2:a3", "type": "ethernet", "autoconnect": false, 
"ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 27885 1726882553.92907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882553.92935: stderr chunk (state=3): >>><<< 27885 1726882553.92938: stdout chunk (state=3): >>><<< 27885 1726882553.92955: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 9dc4eccb-4733-427c-9f5b-05ec76f599ff\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "ca:0d:d0:ac:e2:a3", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "ca:0d:d0:ac:e2:a3", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
27885 1726882553.92995: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest1', 'mac': 'ca:0d:d0:ac:e2:a3', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.4/24', '2001:db8::6/32'], 'route': [{'network': '198.58.10.64', 'prefix': 26, 'gateway': '198.51.100.102', 'metric': 4}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882553.518697-29082-51190619622960/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882553.93002: _low_level_execute_command(): starting 27885 1726882553.93007: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882553.518697-29082-51190619622960/ > /dev/null 2>&1 && sleep 0' 27885 1726882553.93466: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882553.93472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882553.93474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882553.93476: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882553.93478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882553.93533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882553.93540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882553.93542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882553.93609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882553.95430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882553.95454: stderr chunk (state=3): >>><<< 27885 1726882553.95457: stdout chunk (state=3): >>><<< 27885 1726882553.95470: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882553.95476: handler run complete 27885 1726882553.95505: attempt loop complete, returning result 27885 1726882553.95510: _execute() done 27885 1726882553.95512: dumping result to json 27885 1726882553.95518: done dumping result, returning 27885 1726882553.95527: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-3fa5-01be-00000000007c] 27885 1726882553.95533: sending task result for task 12673a56-9f93-3fa5-01be-00000000007c 27885 1726882553.95640: done sending task result for task 12673a56-9f93-3fa5-01be-00000000007c 27885 1726882553.95642: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "address": [ "198.51.100.4/24", "2001:db8::6/32" ], "route": [ { "gateway": "198.51.100.102", "metric": 4, "network": "198.58.10.64", "prefix": 26 } ] }, "mac": "ca:0d:d0:ac:e2:a3", "name": "ethtest1", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 9dc4eccb-4733-427c-9f5b-05ec76f599ff 27885 1726882553.95765: no more pending results, returning what we have 27885 1726882553.95768: results queue empty 27885 1726882553.95769: checking for any_errors_fatal 27885 1726882553.95778: done checking for any_errors_fatal 27885 1726882553.95779: checking for max_fail_percentage 27885 1726882553.95780: done checking for max_fail_percentage 27885 1726882553.95781: checking to see if all hosts have failed and the running result is not ok 27885 1726882553.95782: done checking to see if all hosts have failed 27885 1726882553.95782: getting the remaining hosts for this loop 27885 1726882553.95784: done getting the remaining hosts for this loop 27885 1726882553.95787: getting the next task for host managed_node2 27885 1726882553.95796: done getting next task for host managed_node2 27885 1726882553.95800: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 27885 1726882553.95803: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882553.95813: getting variables 27885 1726882553.95814: in VariableManager get_vars() 27885 1726882553.95849: Calling all_inventory to load vars for managed_node2 27885 1726882553.95852: Calling groups_inventory to load vars for managed_node2 27885 1726882553.95854: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882553.95862: Calling all_plugins_play to load vars for managed_node2 27885 1726882553.95865: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882553.95867: Calling groups_plugins_play to load vars for managed_node2 27885 1726882553.96795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882553.97662: done with get_vars() 27885 1726882553.97677: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:35:53 -0400 (0:00:00.619) 0:00:26.619 ****** 27885 1726882553.97742: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 27885 1726882553.97973: worker is 1 (out of 1 available) 27885 1726882553.97985: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 27885 1726882553.98001: done queuing things up, now waiting for results queue to drain 27885 1726882553.98003: waiting for pending results... 27885 1726882553.98172: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 27885 1726882553.98264: in run() - task 12673a56-9f93-3fa5-01be-00000000007d 27885 1726882553.98276: variable 'ansible_search_path' from source: unknown 27885 1726882553.98279: variable 'ansible_search_path' from source: unknown 27885 1726882553.98309: calling self._execute() 27885 1726882553.98382: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882553.98386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882553.98397: variable 'omit' from source: magic vars 27885 1726882553.98667: variable 'ansible_distribution_major_version' from source: facts 27885 1726882553.98671: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882553.98753: variable 'network_state' from source: role '' defaults 27885 1726882553.98761: Evaluated conditional (network_state != {}): False 27885 1726882553.98764: when evaluation is False, skipping this task 27885 1726882553.98768: _execute() done 27885 1726882553.98772: dumping result to json 27885 1726882553.98775: done dumping result, returning 27885 1726882553.98778: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-3fa5-01be-00000000007d] 27885 1726882553.98794: sending task result for task 12673a56-9f93-3fa5-01be-00000000007d 27885 1726882553.98865: done sending task result for task 12673a56-9f93-3fa5-01be-00000000007d 27885 1726882553.98868: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27885 1726882553.98939: no more pending results, returning what we have 27885 
1726882553.98943: results queue empty 27885 1726882553.98944: checking for any_errors_fatal 27885 1726882553.98955: done checking for any_errors_fatal 27885 1726882553.98956: checking for max_fail_percentage 27885 1726882553.98958: done checking for max_fail_percentage 27885 1726882553.98958: checking to see if all hosts have failed and the running result is not ok 27885 1726882553.98959: done checking to see if all hosts have failed 27885 1726882553.98960: getting the remaining hosts for this loop 27885 1726882553.98961: done getting the remaining hosts for this loop 27885 1726882553.98964: getting the next task for host managed_node2 27885 1726882553.98970: done getting next task for host managed_node2 27885 1726882553.98972: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 27885 1726882553.98975: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882553.98992: getting variables 27885 1726882553.98996: in VariableManager get_vars() 27885 1726882553.99026: Calling all_inventory to load vars for managed_node2 27885 1726882553.99029: Calling groups_inventory to load vars for managed_node2 27885 1726882553.99031: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882553.99039: Calling all_plugins_play to load vars for managed_node2 27885 1726882553.99041: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882553.99043: Calling groups_plugins_play to load vars for managed_node2 27885 1726882553.99815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882554.00964: done with get_vars() 27885 1726882554.00986: done getting variables 27885 1726882554.01050: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:35:54 -0400 (0:00:00.033) 0:00:26.653 ****** 27885 1726882554.01079: entering _queue_task() for managed_node2/debug 27885 1726882554.01554: worker is 1 (out of 1 available) 27885 1726882554.01566: exiting _queue_task() for managed_node2/debug 27885 1726882554.01602: done queuing things up, now waiting for results queue to drain 27885 1726882554.01604: waiting for pending results... 
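The debug task queued here (tasks/main.yml:177) and its companion at :181 simply render the registered '__network_connections_result' fact with the 'debug' action; the 'ok: [managed_node2] => {...}' blocks further down are the output of 'debug: var=...'. A sketch of equivalent tasks, inferred from that rendered output (the role's actual task definitions may differ in detail):

    # Reconstructed from the rendered output below; task names follow the log.
    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines
      when: ansible_distribution_major_version != '6'

    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result
      when: ansible_distribution_major_version != '6'
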
27885 1726882554.01979: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 27885 1726882554.02169: in run() - task 12673a56-9f93-3fa5-01be-00000000007e 27885 1726882554.02173: variable 'ansible_search_path' from source: unknown 27885 1726882554.02176: variable 'ansible_search_path' from source: unknown 27885 1726882554.02257: calling self._execute() 27885 1726882554.02340: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882554.02345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882554.02355: variable 'omit' from source: magic vars 27885 1726882554.02642: variable 'ansible_distribution_major_version' from source: facts 27885 1726882554.02650: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882554.02658: variable 'omit' from source: magic vars 27885 1726882554.02704: variable 'omit' from source: magic vars 27885 1726882554.02729: variable 'omit' from source: magic vars 27885 1726882554.02759: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882554.02788: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882554.02809: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882554.02823: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882554.02832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882554.02855: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882554.02858: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882554.02861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882554.02936: Set connection var ansible_pipelining to False 27885 1726882554.02939: Set connection var ansible_connection to ssh 27885 1726882554.02944: Set connection var ansible_timeout to 10 27885 1726882554.02947: Set connection var ansible_shell_type to sh 27885 1726882554.02952: Set connection var ansible_shell_executable to /bin/sh 27885 1726882554.02957: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882554.02975: variable 'ansible_shell_executable' from source: unknown 27885 1726882554.02978: variable 'ansible_connection' from source: unknown 27885 1726882554.02981: variable 'ansible_module_compression' from source: unknown 27885 1726882554.02983: variable 'ansible_shell_type' from source: unknown 27885 1726882554.02987: variable 'ansible_shell_executable' from source: unknown 27885 1726882554.02990: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882554.02992: variable 'ansible_pipelining' from source: unknown 27885 1726882554.03001: variable 'ansible_timeout' from source: unknown 27885 1726882554.03003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882554.03098: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 
1726882554.03110: variable 'omit' from source: magic vars 27885 1726882554.03113: starting attempt loop 27885 1726882554.03115: running the handler 27885 1726882554.03206: variable '__network_connections_result' from source: set_fact 27885 1726882554.03250: handler run complete 27885 1726882554.03262: attempt loop complete, returning result 27885 1726882554.03265: _execute() done 27885 1726882554.03302: dumping result to json 27885 1726882554.03305: done dumping result, returning 27885 1726882554.03308: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-3fa5-01be-00000000007e] 27885 1726882554.03354: sending task result for task 12673a56-9f93-3fa5-01be-00000000007e 27885 1726882554.03634: done sending task result for task 12673a56-9f93-3fa5-01be-00000000007e 27885 1726882554.03637: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 9dc4eccb-4733-427c-9f5b-05ec76f599ff" ] } 27885 1726882554.03771: no more pending results, returning what we have 27885 1726882554.03774: results queue empty 27885 1726882554.03775: checking for any_errors_fatal 27885 1726882554.03779: done checking for any_errors_fatal 27885 1726882554.03780: checking for max_fail_percentage 27885 1726882554.03781: done checking for max_fail_percentage 27885 1726882554.03782: checking to see if all hosts have failed and the running result is not ok 27885 1726882554.03783: done checking to see if all hosts have failed 27885 1726882554.03783: getting the remaining hosts for this loop 27885 1726882554.03785: done getting the remaining hosts for this loop 27885 1726882554.03788: getting the next task for host managed_node2 27885 1726882554.03795: done getting next task for host managed_node2 27885 1726882554.03798: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 27885 1726882554.03801: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882554.03811: getting variables 27885 1726882554.03812: in VariableManager get_vars() 27885 1726882554.03844: Calling all_inventory to load vars for managed_node2 27885 1726882554.03847: Calling groups_inventory to load vars for managed_node2 27885 1726882554.03848: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882554.03856: Calling all_plugins_play to load vars for managed_node2 27885 1726882554.03858: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882554.03861: Calling groups_plugins_play to load vars for managed_node2 27885 1726882554.05446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882554.07052: done with get_vars() 27885 1726882554.07075: done getting variables 27885 1726882554.07137: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:35:54 -0400 (0:00:00.060) 0:00:26.714 ****** 27885 1726882554.07169: entering _queue_task() for managed_node2/debug 27885 1726882554.07502: worker is 1 (out of 1 available) 27885 1726882554.07514: exiting _queue_task() for managed_node2/debug 27885 1726882554.07527: done queuing things up, now waiting for results queue to drain 27885 1726882554.07528: waiting for pending results... 27885 1726882554.07910: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 27885 1726882554.07958: in run() - task 12673a56-9f93-3fa5-01be-00000000007f 27885 1726882554.07978: variable 'ansible_search_path' from source: unknown 27885 1726882554.07985: variable 'ansible_search_path' from source: unknown 27885 1726882554.08034: calling self._execute() 27885 1726882554.08137: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882554.08245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882554.08249: variable 'omit' from source: magic vars 27885 1726882554.08545: variable 'ansible_distribution_major_version' from source: facts 27885 1726882554.08560: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882554.08575: variable 'omit' from source: magic vars 27885 1726882554.08639: variable 'omit' from source: magic vars 27885 1726882554.08682: variable 'omit' from source: magic vars 27885 1726882554.08730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882554.08767: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882554.08799: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882554.08822: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882554.08838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882554.08869: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882554.08877: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882554.08884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882554.08995: Set connection var ansible_pipelining to False 27885 1726882554.09010: Set connection var ansible_connection to ssh 27885 1726882554.09021: Set connection var ansible_timeout to 10 27885 1726882554.09027: Set connection var ansible_shell_type to sh 27885 1726882554.09036: Set connection var ansible_shell_executable to /bin/sh 27885 1726882554.09115: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882554.09118: variable 'ansible_shell_executable' from source: unknown 27885 1726882554.09120: variable 'ansible_connection' from source: unknown 27885 1726882554.09123: variable 'ansible_module_compression' from source: unknown 27885 1726882554.09125: variable 'ansible_shell_type' from source: unknown 27885 1726882554.09127: variable 'ansible_shell_executable' from source: unknown 27885 1726882554.09129: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882554.09131: variable 'ansible_pipelining' from source: unknown 27885 1726882554.09132: variable 'ansible_timeout' from source: unknown 27885 1726882554.09135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882554.09254: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882554.09270: variable 'omit' from source: magic vars 27885 1726882554.09280: starting attempt loop 27885 1726882554.09291: running the handler 27885 1726882554.09348: variable '__network_connections_result' from source: set_fact 27885 1726882554.09429: variable '__network_connections_result' from source: set_fact 27885 1726882554.09575: handler run complete 27885 1726882554.09615: attempt loop complete, returning result 27885 1726882554.09658: _execute() done 27885 1726882554.09661: dumping result to json 27885 1726882554.09663: done dumping result, returning 27885 1726882554.09665: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-3fa5-01be-00000000007f] 27885 1726882554.09667: sending task result for task 12673a56-9f93-3fa5-01be-00000000007f ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "address": [ "198.51.100.4/24", "2001:db8::6/32" ], "route": [ { "gateway": "198.51.100.102", "metric": 4, "network": "198.58.10.64", "prefix": 26 } ] }, "mac": "ca:0d:d0:ac:e2:a3", "name": "ethtest1", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 9dc4eccb-4733-427c-9f5b-05ec76f599ff\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 9dc4eccb-4733-427c-9f5b-05ec76f599ff" ] } } 27885 1726882554.09863: no more pending results, returning what we have 27885 
1726882554.09866: results queue empty 27885 1726882554.09867: checking for any_errors_fatal 27885 1726882554.09875: done checking for any_errors_fatal 27885 1726882554.09876: checking for max_fail_percentage 27885 1726882554.09878: done checking for max_fail_percentage 27885 1726882554.09879: checking to see if all hosts have failed and the running result is not ok 27885 1726882554.09880: done checking to see if all hosts have failed 27885 1726882554.09881: getting the remaining hosts for this loop 27885 1726882554.09882: done getting the remaining hosts for this loop 27885 1726882554.09886: getting the next task for host managed_node2 27885 1726882554.09897: done getting next task for host managed_node2 27885 1726882554.09900: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 27885 1726882554.09904: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882554.09916: getting variables 27885 1726882554.09918: in VariableManager get_vars() 27885 1726882554.09954: Calling all_inventory to load vars for managed_node2 27885 1726882554.09956: Calling groups_inventory to load vars for managed_node2 27885 1726882554.09959: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882554.09975: Calling all_plugins_play to load vars for managed_node2 27885 1726882554.09978: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882554.09981: Calling groups_plugins_play to load vars for managed_node2 27885 1726882554.10901: done sending task result for task 12673a56-9f93-3fa5-01be-00000000007f 27885 1726882554.10905: WORKER PROCESS EXITING 27885 1726882554.12642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882554.16168: done with get_vars() 27885 1726882554.16194: done getting variables 27885 1726882554.16252: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:35:54 -0400 (0:00:00.091) 0:00:26.805 ****** 27885 1726882554.16287: entering _queue_task() for managed_node2/debug 27885 1726882554.16614: worker is 1 (out of 1 available) 27885 1726882554.16626: exiting _queue_task() for managed_node2/debug 27885 1726882554.16637: done queuing things up, now waiting for results queue to drain 27885 1726882554.16638: waiting for pending results... 
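The task queued next (Show debug messages for the network_state, tasks/main.yml:186) is skipped below on the same 'network_state != {}' guard, and the role then re-tests reachability with the bare 'ping' module; the trace that follows shows its AnsiballZ payload being copied over the existing SSH ControlMaster and run with the remote /usr/bin/python3.12. A minimal sketch of such a task (the role's actual task at tasks/main.yml:192 may differ):

    # Minimal connectivity re-test; 'ping' takes no required arguments
    # and simply returns {"ping": "pong"} on success.
    - name: Re-test connectivity
      ansible.builtin.ping:
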
27885 1726882554.16918: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 27885 1726882554.17063: in run() - task 12673a56-9f93-3fa5-01be-000000000080 27885 1726882554.17082: variable 'ansible_search_path' from source: unknown 27885 1726882554.17092: variable 'ansible_search_path' from source: unknown 27885 1726882554.17137: calling self._execute() 27885 1726882554.17240: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882554.17251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882554.17263: variable 'omit' from source: magic vars 27885 1726882554.17638: variable 'ansible_distribution_major_version' from source: facts 27885 1726882554.17654: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882554.17785: variable 'network_state' from source: role '' defaults 27885 1726882554.17803: Evaluated conditional (network_state != {}): False 27885 1726882554.17811: when evaluation is False, skipping this task 27885 1726882554.17818: _execute() done 27885 1726882554.17825: dumping result to json 27885 1726882554.17831: done dumping result, returning 27885 1726882554.17842: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-3fa5-01be-000000000080] 27885 1726882554.17851: sending task result for task 12673a56-9f93-3fa5-01be-000000000080 skipping: [managed_node2] => { "false_condition": "network_state != {}" } 27885 1726882554.18032: no more pending results, returning what we have 27885 1726882554.18036: results queue empty 27885 1726882554.18037: checking for any_errors_fatal 27885 1726882554.18047: done checking for any_errors_fatal 27885 1726882554.18048: checking for max_fail_percentage 27885 1726882554.18049: done checking for max_fail_percentage 27885 1726882554.18050: checking to see if all hosts have failed and the running result is not ok 27885 1726882554.18051: done checking to see if all hosts have failed 27885 1726882554.18051: getting the remaining hosts for this loop 27885 1726882554.18053: done getting the remaining hosts for this loop 27885 1726882554.18057: getting the next task for host managed_node2 27885 1726882554.18064: done getting next task for host managed_node2 27885 1726882554.18068: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 27885 1726882554.18072: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882554.18096: getting variables 27885 1726882554.18098: in VariableManager get_vars() 27885 1726882554.18136: Calling all_inventory to load vars for managed_node2 27885 1726882554.18139: Calling groups_inventory to load vars for managed_node2 27885 1726882554.18141: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882554.18153: Calling all_plugins_play to load vars for managed_node2 27885 1726882554.18156: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882554.18159: Calling groups_plugins_play to load vars for managed_node2 27885 1726882554.18906: done sending task result for task 12673a56-9f93-3fa5-01be-000000000080 27885 1726882554.18910: WORKER PROCESS EXITING 27885 1726882554.19720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882554.22575: done with get_vars() 27885 1726882554.22751: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:35:54 -0400 (0:00:00.065) 0:00:26.871 ****** 27885 1726882554.22888: entering _queue_task() for managed_node2/ping 27885 1726882554.23808: worker is 1 (out of 1 available) 27885 1726882554.23822: exiting _queue_task() for managed_node2/ping 27885 1726882554.23835: done queuing things up, now waiting for results queue to drain 27885 1726882554.23836: waiting for pending results... 27885 1726882554.24114: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 27885 1726882554.24258: in run() - task 12673a56-9f93-3fa5-01be-000000000081 27885 1726882554.24335: variable 'ansible_search_path' from source: unknown 27885 1726882554.24345: variable 'ansible_search_path' from source: unknown 27885 1726882554.24387: calling self._execute() 27885 1726882554.24507: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882554.24525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882554.24541: variable 'omit' from source: magic vars 27885 1726882554.24926: variable 'ansible_distribution_major_version' from source: facts 27885 1726882554.24942: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882554.24957: variable 'omit' from source: magic vars 27885 1726882554.25022: variable 'omit' from source: magic vars 27885 1726882554.25066: variable 'omit' from source: magic vars 27885 1726882554.25114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882554.25154: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882554.25225: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882554.25250: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882554.25269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882554.25308: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882554.25316: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882554.25323: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node2' 27885 1726882554.25512: Set connection var ansible_pipelining to False 27885 1726882554.25522: Set connection var ansible_connection to ssh 27885 1726882554.25532: Set connection var ansible_timeout to 10 27885 1726882554.25537: Set connection var ansible_shell_type to sh 27885 1726882554.25546: Set connection var ansible_shell_executable to /bin/sh 27885 1726882554.25555: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882554.25584: variable 'ansible_shell_executable' from source: unknown 27885 1726882554.25596: variable 'ansible_connection' from source: unknown 27885 1726882554.25608: variable 'ansible_module_compression' from source: unknown 27885 1726882554.25615: variable 'ansible_shell_type' from source: unknown 27885 1726882554.25623: variable 'ansible_shell_executable' from source: unknown 27885 1726882554.25629: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882554.25635: variable 'ansible_pipelining' from source: unknown 27885 1726882554.25641: variable 'ansible_timeout' from source: unknown 27885 1726882554.25648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882554.25853: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882554.25870: variable 'omit' from source: magic vars 27885 1726882554.25879: starting attempt loop 27885 1726882554.25885: running the handler 27885 1726882554.25913: _low_level_execute_command(): starting 27885 1726882554.25925: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882554.26622: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882554.26678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882554.26755: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882554.26797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882554.26800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882554.26903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882554.28548: stdout chunk (state=3): >>>/root <<< 27885 1726882554.28717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882554.28720: stdout chunk (state=3): >>><<< 27885 1726882554.28723: stderr chunk (state=3): >>><<< 27885 1726882554.28758: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882554.28798: _low_level_execute_command(): starting 27885 1726882554.28803: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882554.2876487-29128-141217308476031 `" && echo ansible-tmp-1726882554.2876487-29128-141217308476031="` echo /root/.ansible/tmp/ansible-tmp-1726882554.2876487-29128-141217308476031 `" ) && sleep 0' 27885 1726882554.29353: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882554.29363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882554.29399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882554.29403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882554.29405: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882554.29408: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882554.29533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882554.29538: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27885 1726882554.29541: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882554.29543: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27885 1726882554.29545: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882554.29547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882554.29549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882554.29551: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882554.29553: stderr chunk (state=3): >>>debug2: match found <<< 27885 1726882554.29555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882554.29562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/6f323b04b0' <<< 27885 1726882554.29572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882554.29595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882554.29680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882554.31799: stdout chunk (state=3): >>>ansible-tmp-1726882554.2876487-29128-141217308476031=/root/.ansible/tmp/ansible-tmp-1726882554.2876487-29128-141217308476031 <<< 27885 1726882554.31802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882554.31805: stdout chunk (state=3): >>><<< 27885 1726882554.31807: stderr chunk (state=3): >>><<< 27885 1726882554.31809: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882554.2876487-29128-141217308476031=/root/.ansible/tmp/ansible-tmp-1726882554.2876487-29128-141217308476031 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882554.31850: variable 'ansible_module_compression' from source: unknown 27885 1726882554.31894: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 27885 1726882554.31930: variable 'ansible_facts' from source: unknown 27885 1726882554.32312: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882554.2876487-29128-141217308476031/AnsiballZ_ping.py 27885 1726882554.32600: Sending initial data 27885 1726882554.32603: Sent initial data (153 bytes) 27885 1726882554.33346: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882554.33352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882554.33364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882554.33471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882554.33479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882554.33497: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882554.33581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882554.35116: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 27885 1726882554.35131: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 27885 1726882554.35171: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 27885 1726882554.35297: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882554.35316: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882554.35501: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpinq10t2v /root/.ansible/tmp/ansible-tmp-1726882554.2876487-29128-141217308476031/AnsiballZ_ping.py <<< 27885 1726882554.35504: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882554.2876487-29128-141217308476031/AnsiballZ_ping.py" <<< 27885 1726882554.35508: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpinq10t2v" to remote "/root/.ansible/tmp/ansible-tmp-1726882554.2876487-29128-141217308476031/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882554.2876487-29128-141217308476031/AnsiballZ_ping.py" <<< 27885 1726882554.36511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882554.36569: stderr chunk (state=3): >>><<< 27885 1726882554.36583: stdout chunk (state=3): >>><<< 27885 1726882554.36638: done transferring module to remote 27885 1726882554.36653: _low_level_execute_command(): starting 27885 1726882554.36662: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882554.2876487-29128-141217308476031/ /root/.ansible/tmp/ansible-tmp-1726882554.2876487-29128-141217308476031/AnsiballZ_ping.py && sleep 0' 27885 1726882554.37300: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882554.37416: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882554.37441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882554.37455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882554.37543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882554.39475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882554.39479: stdout chunk (state=3): >>><<< 27885 1726882554.39481: stderr chunk (state=3): >>><<< 27885 1726882554.39547: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882554.39550: _low_level_execute_command(): starting 27885 1726882554.39559: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882554.2876487-29128-141217308476031/AnsiballZ_ping.py && sleep 0' 27885 1726882554.40186: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882554.40207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882554.40222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882554.40244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882554.40282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882554.40383: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882554.40405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882554.40520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882554.55291: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 27885 1726882554.56464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882554.56488: stderr chunk (state=3): >>><<< 27885 1726882554.56491: stdout chunk (state=3): >>><<< 27885 1726882554.56515: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
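Note: the exchange above is the standard Ansible module lifecycle for this ping. A per-task temp directory is created under /root/.ansible/tmp with umask 77, the AnsiballZ-wrapped module is uploaded over the existing SSH ControlMaster via SFTP, marked executable, run with the remote /usr/bin/python3.12, and the temp directory is removed once the JSON result ({"ping": "pong"}) has been read back. The sketch below shows the kind of task that drives this round-trip; the task name matches the "Re-test connectivity" task reported in the following records, while the bare module invocation is an assumption consistent with the default data/"pong" result shown above.

```yaml
# Sketch only: the role's actual task file is not reproduced in this log.
# A plain ping with no arguments yields the observed module_args
# ({"data": "pong"}) and the {"ping": "pong"} return value.
- name: Re-test connectivity
  ansible.builtin.ping:
```
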
27885 1726882554.56537: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882554.2876487-29128-141217308476031/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882554.56545: _low_level_execute_command(): starting 27885 1726882554.56550: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882554.2876487-29128-141217308476031/ > /dev/null 2>&1 && sleep 0' 27885 1726882554.56960: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882554.57000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882554.57003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882554.57005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882554.57007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882554.57010: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882554.57052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882554.57056: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882554.57060: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882554.57122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882554.58912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882554.58940: stderr chunk (state=3): >>><<< 27885 1726882554.58948: stdout chunk (state=3): >>><<< 27885 1726882554.58959: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882554.58964: handler run complete 27885 1726882554.58977: attempt loop complete, returning result 27885 1726882554.58980: _execute() done 27885 1726882554.58982: dumping result to json 27885 1726882554.58984: done dumping result, returning 27885 1726882554.58996: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-3fa5-01be-000000000081] 27885 1726882554.59001: sending task result for task 12673a56-9f93-3fa5-01be-000000000081 27885 1726882554.59083: done sending task result for task 12673a56-9f93-3fa5-01be-000000000081 27885 1726882554.59085: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 27885 1726882554.59148: no more pending results, returning what we have 27885 1726882554.59152: results queue empty 27885 1726882554.59153: checking for any_errors_fatal 27885 1726882554.59160: done checking for any_errors_fatal 27885 1726882554.59161: checking for max_fail_percentage 27885 1726882554.59162: done checking for max_fail_percentage 27885 1726882554.59163: checking to see if all hosts have failed and the running result is not ok 27885 1726882554.59164: done checking to see if all hosts have failed 27885 1726882554.59164: getting the remaining hosts for this loop 27885 1726882554.59166: done getting the remaining hosts for this loop 27885 1726882554.59169: getting the next task for host managed_node2 27885 1726882554.59179: done getting next task for host managed_node2 27885 1726882554.59181: ^ task is: TASK: meta (role_complete) 27885 1726882554.59185: ^ state is: HOST STATE: block=3, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882554.59200: getting variables 27885 1726882554.59202: in VariableManager get_vars() 27885 1726882554.59241: Calling all_inventory to load vars for managed_node2 27885 1726882554.59244: Calling groups_inventory to load vars for managed_node2 27885 1726882554.59246: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882554.59256: Calling all_plugins_play to load vars for managed_node2 27885 1726882554.59258: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882554.59261: Calling groups_plugins_play to load vars for managed_node2 27885 1726882554.60191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882554.61059: done with get_vars() 27885 1726882554.61075: done getting variables 27885 1726882554.61138: done queuing things up, now waiting for results queue to drain 27885 1726882554.61140: results queue empty 27885 1726882554.61141: checking for any_errors_fatal 27885 1726882554.61143: done checking for any_errors_fatal 27885 1726882554.61143: checking for max_fail_percentage 27885 1726882554.61144: done checking for max_fail_percentage 27885 1726882554.61144: checking to see if all hosts have failed and the running result is not ok 27885 1726882554.61145: done checking to see if all hosts have failed 27885 1726882554.61145: getting the remaining hosts for this loop 27885 1726882554.61146: done getting the remaining hosts for this loop 27885 1726882554.61148: getting the next task for host managed_node2 27885 1726882554.61150: done getting next task for host managed_node2 27885 1726882554.61152: ^ task is: TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider 27885 1726882554.61153: ^ state is: HOST STATE: block=3, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882554.61154: getting variables 27885 1726882554.61155: in VariableManager get_vars() 27885 1726882554.61164: Calling all_inventory to load vars for managed_node2 27885 1726882554.61166: Calling groups_inventory to load vars for managed_node2 27885 1726882554.61167: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882554.61170: Calling all_plugins_play to load vars for managed_node2 27885 1726882554.61172: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882554.61173: Calling groups_plugins_play to load vars for managed_node2 27885 1726882554.61800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882554.62660: done with get_vars() 27885 1726882554.62674: done getting variables 27885 1726882554.62710: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the warning about specifying the route without the output device is logged for initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:122 Friday 20 September 2024 21:35:54 -0400 (0:00:00.398) 0:00:27.269 ****** 27885 1726882554.62730: entering _queue_task() for managed_node2/assert 27885 1726882554.63001: worker is 1 (out of 1 available) 27885 1726882554.63014: exiting _queue_task() for managed_node2/assert 27885 1726882554.63027: done queuing things up, now waiting for results queue to drain 27885 1726882554.63029: waiting for pending results... 
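Note: the task queued here (tests_route_device.yml:122) is guarded so that it only runs for the initscripts provider; in the execution that follows, network_provider is "nm", so the conditional evaluates to False and the assert is skipped. A minimal sketch of such a guarded assert is shown below, assuming a typical structure; the actual assertion expressions are not reproduced in this log, so the "that" entry and the __route_without_device_warning variable are hypothetical placeholders.

```yaml
- name: Assert that the warning about specifying the route without the output device is logged for initscripts provider
  ansible.builtin.assert:
    that:
      # Illustrative placeholder: the real test checks the role's recorded
      # stderr for the expected warning text, which is not visible here.
      - __network_connections_result.stderr is search(__route_without_device_warning)
  when: network_provider == "initscripts"
```
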
27885 1726882554.63202: running TaskExecutor() for managed_node2/TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider 27885 1726882554.63269: in run() - task 12673a56-9f93-3fa5-01be-0000000000b1 27885 1726882554.63281: variable 'ansible_search_path' from source: unknown 27885 1726882554.63314: calling self._execute() 27885 1726882554.63395: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882554.63399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882554.63407: variable 'omit' from source: magic vars 27885 1726882554.63682: variable 'ansible_distribution_major_version' from source: facts 27885 1726882554.63697: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882554.63770: variable 'network_provider' from source: set_fact 27885 1726882554.63773: Evaluated conditional (network_provider == "initscripts"): False 27885 1726882554.63777: when evaluation is False, skipping this task 27885 1726882554.63780: _execute() done 27885 1726882554.63785: dumping result to json 27885 1726882554.63787: done dumping result, returning 27885 1726882554.63797: done running TaskExecutor() for managed_node2/TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider [12673a56-9f93-3fa5-01be-0000000000b1] 27885 1726882554.63804: sending task result for task 12673a56-9f93-3fa5-01be-0000000000b1 27885 1726882554.63892: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000b1 27885 1726882554.63898: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 27885 1726882554.63952: no more pending results, returning what we have 27885 1726882554.63955: results queue empty 27885 1726882554.63957: checking for any_errors_fatal 27885 1726882554.63958: done checking for any_errors_fatal 27885 1726882554.63959: checking for max_fail_percentage 27885 1726882554.63961: done checking for max_fail_percentage 27885 1726882554.63961: checking to see if all hosts have failed and the running result is not ok 27885 1726882554.63962: done checking to see if all hosts have failed 27885 1726882554.63963: getting the remaining hosts for this loop 27885 1726882554.63965: done getting the remaining hosts for this loop 27885 1726882554.63970: getting the next task for host managed_node2 27885 1726882554.63976: done getting next task for host managed_node2 27885 1726882554.63979: ^ task is: TASK: Assert that no warning is logged for nm provider 27885 1726882554.63982: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882554.63985: getting variables 27885 1726882554.63987: in VariableManager get_vars() 27885 1726882554.64026: Calling all_inventory to load vars for managed_node2 27885 1726882554.64029: Calling groups_inventory to load vars for managed_node2 27885 1726882554.64031: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882554.64041: Calling all_plugins_play to load vars for managed_node2 27885 1726882554.64043: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882554.64046: Calling groups_plugins_play to load vars for managed_node2 27885 1726882554.68076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882554.68931: done with get_vars() 27885 1726882554.68948: done getting variables 27885 1726882554.68984: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that no warning is logged for nm provider] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:129 Friday 20 September 2024 21:35:54 -0400 (0:00:00.062) 0:00:27.332 ****** 27885 1726882554.69006: entering _queue_task() for managed_node2/assert 27885 1726882554.69278: worker is 1 (out of 1 available) 27885 1726882554.69297: exiting _queue_task() for managed_node2/assert 27885 1726882554.69309: done queuing things up, now waiting for results queue to drain 27885 1726882554.69311: waiting for pending results... 27885 1726882554.69507: running TaskExecutor() for managed_node2/TASK: Assert that no warning is logged for nm provider 27885 1726882554.69571: in run() - task 12673a56-9f93-3fa5-01be-0000000000b2 27885 1726882554.69585: variable 'ansible_search_path' from source: unknown 27885 1726882554.69621: calling self._execute() 27885 1726882554.69711: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882554.69715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882554.69725: variable 'omit' from source: magic vars 27885 1726882554.70029: variable 'ansible_distribution_major_version' from source: facts 27885 1726882554.70038: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882554.70119: variable 'network_provider' from source: set_fact 27885 1726882554.70123: Evaluated conditional (network_provider == "nm"): True 27885 1726882554.70130: variable 'omit' from source: magic vars 27885 1726882554.70145: variable 'omit' from source: magic vars 27885 1726882554.70171: variable 'omit' from source: magic vars 27885 1726882554.70207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882554.70243: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882554.70259: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882554.70272: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882554.70282: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 27885 1726882554.70312: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882554.70315: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882554.70319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882554.70386: Set connection var ansible_pipelining to False 27885 1726882554.70390: Set connection var ansible_connection to ssh 27885 1726882554.70398: Set connection var ansible_timeout to 10 27885 1726882554.70401: Set connection var ansible_shell_type to sh 27885 1726882554.70407: Set connection var ansible_shell_executable to /bin/sh 27885 1726882554.70417: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882554.70434: variable 'ansible_shell_executable' from source: unknown 27885 1726882554.70437: variable 'ansible_connection' from source: unknown 27885 1726882554.70439: variable 'ansible_module_compression' from source: unknown 27885 1726882554.70442: variable 'ansible_shell_type' from source: unknown 27885 1726882554.70445: variable 'ansible_shell_executable' from source: unknown 27885 1726882554.70447: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882554.70449: variable 'ansible_pipelining' from source: unknown 27885 1726882554.70451: variable 'ansible_timeout' from source: unknown 27885 1726882554.70453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882554.70557: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882554.70567: variable 'omit' from source: magic vars 27885 1726882554.70572: starting attempt loop 27885 1726882554.70574: running the handler 27885 1726882554.70685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882554.70962: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882554.71114: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882554.71117: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882554.71119: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882554.71204: variable '__network_connections_result' from source: set_fact 27885 1726882554.71234: Evaluated conditional (__network_connections_result.stderr is not search("")): True 27885 1726882554.71241: handler run complete 27885 1726882554.71244: attempt loop complete, returning result 27885 1726882554.71247: _execute() done 27885 1726882554.71249: dumping result to json 27885 1726882554.71254: done dumping result, returning 27885 1726882554.71261: done running TaskExecutor() for managed_node2/TASK: Assert that no warning is logged for nm provider [12673a56-9f93-3fa5-01be-0000000000b2] 27885 1726882554.71265: sending task result for task 12673a56-9f93-3fa5-01be-0000000000b2 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 27885 1726882554.71515: no more pending results, returning what we have 27885 1726882554.71518: results queue empty 27885 1726882554.71519: checking for any_errors_fatal 27885 
1726882554.71526: done checking for any_errors_fatal 27885 1726882554.71527: checking for max_fail_percentage 27885 1726882554.71529: done checking for max_fail_percentage 27885 1726882554.71529: checking to see if all hosts have failed and the running result is not ok 27885 1726882554.71530: done checking to see if all hosts have failed 27885 1726882554.71531: getting the remaining hosts for this loop 27885 1726882554.71532: done getting the remaining hosts for this loop 27885 1726882554.71535: getting the next task for host managed_node2 27885 1726882554.71543: done getting next task for host managed_node2 27885 1726882554.71545: ^ task is: TASK: Bring down test devices and profiles 27885 1726882554.71548: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882554.71552: getting variables 27885 1726882554.71576: in VariableManager get_vars() 27885 1726882554.71611: Calling all_inventory to load vars for managed_node2 27885 1726882554.71613: Calling groups_inventory to load vars for managed_node2 27885 1726882554.71615: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882554.71625: Calling all_plugins_play to load vars for managed_node2 27885 1726882554.71628: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882554.71630: Calling groups_plugins_play to load vars for managed_node2 27885 1726882554.72167: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000b2 27885 1726882554.72170: WORKER PROCESS EXITING 27885 1726882554.72622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882554.73488: done with get_vars() 27885 1726882554.73507: done getting variables TASK [Bring down test devices and profiles] ************************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:140 Friday 20 September 2024 21:35:54 -0400 (0:00:00.045) 0:00:27.378 ****** 27885 1726882554.73575: entering _queue_task() for managed_node2/include_role 27885 1726882554.73577: Creating lock for include_role 27885 1726882554.73834: worker is 1 (out of 1 available) 27885 1726882554.73847: exiting _queue_task() for managed_node2/include_role 27885 1726882554.73859: done queuing things up, now waiting for results queue to drain 27885 1726882554.73860: waiting for pending results... 
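Note: "Bring down test devices and profiles" (tests_route_device.yml:140) re-enters the fedora.linux_system_roles.network role via include_role, which is why the subsequent records show the role's defaults/main.yml, meta/main.yml and tasks/main.yml being loaded and the task lists being extended again. A minimal sketch of such a cleanup step follows, assuming it is driven by include_role with role variables supplied by the test; those variables are not visible in this excerpt and are omitted.

```yaml
- name: Bring down test devices and profiles
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
  # In these tests this step is typically parameterised with
  # network_connections entries that take the test profiles to state: down;
  # the actual values are not shown in this log and are left out here.
```
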
27885 1726882554.74082: running TaskExecutor() for managed_node2/TASK: Bring down test devices and profiles 27885 1726882554.74243: in run() - task 12673a56-9f93-3fa5-01be-0000000000b4 27885 1726882554.74247: variable 'ansible_search_path' from source: unknown 27885 1726882554.74250: calling self._execute() 27885 1726882554.74699: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882554.74703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882554.74706: variable 'omit' from source: magic vars 27885 1726882554.75045: variable 'ansible_distribution_major_version' from source: facts 27885 1726882554.75062: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882554.75073: _execute() done 27885 1726882554.75081: dumping result to json 27885 1726882554.75091: done dumping result, returning 27885 1726882554.75104: done running TaskExecutor() for managed_node2/TASK: Bring down test devices and profiles [12673a56-9f93-3fa5-01be-0000000000b4] 27885 1726882554.75113: sending task result for task 12673a56-9f93-3fa5-01be-0000000000b4 27885 1726882554.75291: no more pending results, returning what we have 27885 1726882554.75298: in VariableManager get_vars() 27885 1726882554.75344: Calling all_inventory to load vars for managed_node2 27885 1726882554.75347: Calling groups_inventory to load vars for managed_node2 27885 1726882554.75350: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882554.75363: Calling all_plugins_play to load vars for managed_node2 27885 1726882554.75366: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882554.75368: Calling groups_plugins_play to load vars for managed_node2 27885 1726882554.76006: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000b4 27885 1726882554.76010: WORKER PROCESS EXITING 27885 1726882554.76807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882554.78436: done with get_vars() 27885 1726882554.78453: variable 'ansible_search_path' from source: unknown 27885 1726882554.78698: variable 'omit' from source: magic vars 27885 1726882554.78731: variable 'omit' from source: magic vars 27885 1726882554.78746: variable 'omit' from source: magic vars 27885 1726882554.78750: we have included files to process 27885 1726882554.78751: generating all_blocks data 27885 1726882554.78753: done generating all_blocks data 27885 1726882554.78758: processing included file: fedora.linux_system_roles.network 27885 1726882554.78778: in VariableManager get_vars() 27885 1726882554.78799: done with get_vars() 27885 1726882554.78827: in VariableManager get_vars() 27885 1726882554.78845: done with get_vars() 27885 1726882554.78886: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 27885 1726882554.79012: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 27885 1726882554.79092: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 27885 1726882554.79569: in VariableManager get_vars() 27885 1726882554.79597: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 27885 1726882554.81442: iterating over new_blocks loaded from include file 27885 1726882554.81445: in VariableManager get_vars() 27885 1726882554.81463: done with get_vars() 27885 
1726882554.81465: filtering new block on tags 27885 1726882554.81678: done filtering new block on tags 27885 1726882554.81682: in VariableManager get_vars() 27885 1726882554.81701: done with get_vars() 27885 1726882554.81703: filtering new block on tags 27885 1726882554.81717: done filtering new block on tags 27885 1726882554.81719: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 27885 1726882554.81724: extending task lists for all hosts with included blocks 27885 1726882554.81933: done extending task lists 27885 1726882554.81935: done processing included files 27885 1726882554.81935: results queue empty 27885 1726882554.81936: checking for any_errors_fatal 27885 1726882554.81939: done checking for any_errors_fatal 27885 1726882554.81940: checking for max_fail_percentage 27885 1726882554.81941: done checking for max_fail_percentage 27885 1726882554.81942: checking to see if all hosts have failed and the running result is not ok 27885 1726882554.81943: done checking to see if all hosts have failed 27885 1726882554.81943: getting the remaining hosts for this loop 27885 1726882554.81944: done getting the remaining hosts for this loop 27885 1726882554.81947: getting the next task for host managed_node2 27885 1726882554.81951: done getting next task for host managed_node2 27885 1726882554.81954: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 27885 1726882554.81957: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882554.81966: getting variables 27885 1726882554.81967: in VariableManager get_vars() 27885 1726882554.81982: Calling all_inventory to load vars for managed_node2 27885 1726882554.81985: Calling groups_inventory to load vars for managed_node2 27885 1726882554.81987: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882554.81997: Calling all_plugins_play to load vars for managed_node2 27885 1726882554.81999: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882554.82003: Calling groups_plugins_play to load vars for managed_node2 27885 1726882554.83336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882554.84978: done with get_vars() 27885 1726882554.85003: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:35:54 -0400 (0:00:00.115) 0:00:27.493 ****** 27885 1726882554.85079: entering _queue_task() for managed_node2/include_tasks 27885 1726882554.85450: worker is 1 (out of 1 available) 27885 1726882554.85465: exiting _queue_task() for managed_node2/include_tasks 27885 1726882554.85478: done queuing things up, now waiting for results queue to drain 27885 1726882554.85479: waiting for pending results... 27885 1726882554.85729: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 27885 1726882554.85868: in run() - task 12673a56-9f93-3fa5-01be-000000000641 27885 1726882554.85895: variable 'ansible_search_path' from source: unknown 27885 1726882554.85904: variable 'ansible_search_path' from source: unknown 27885 1726882554.85946: calling self._execute() 27885 1726882554.86050: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882554.86062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882554.86074: variable 'omit' from source: magic vars 27885 1726882554.86511: variable 'ansible_distribution_major_version' from source: facts 27885 1726882554.86643: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882554.86651: _execute() done 27885 1726882554.86653: dumping result to json 27885 1726882554.86656: done dumping result, returning 27885 1726882554.86658: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-3fa5-01be-000000000641] 27885 1726882554.86660: sending task result for task 12673a56-9f93-3fa5-01be-000000000641 27885 1726882554.86736: done sending task result for task 12673a56-9f93-3fa5-01be-000000000641 27885 1726882554.86739: WORKER PROCESS EXITING 27885 1726882554.86796: no more pending results, returning what we have 27885 1726882554.86801: in VariableManager get_vars() 27885 1726882554.86845: Calling all_inventory to load vars for managed_node2 27885 1726882554.86848: Calling groups_inventory to load vars for managed_node2 27885 1726882554.86850: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882554.86862: Calling all_plugins_play to load vars for managed_node2 27885 1726882554.86865: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882554.86868: Calling groups_plugins_play to load vars for managed_node2 27885 1726882554.88185: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882554.90159: done with get_vars() 27885 1726882554.90181: variable 'ansible_search_path' from source: unknown 27885 1726882554.90182: variable 'ansible_search_path' from source: unknown 27885 1726882554.90354: we have included files to process 27885 1726882554.90355: generating all_blocks data 27885 1726882554.90357: done generating all_blocks data 27885 1726882554.90361: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27885 1726882554.90362: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27885 1726882554.90428: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27885 1726882554.91863: done processing included file 27885 1726882554.91865: iterating over new_blocks loaded from include file 27885 1726882554.91866: in VariableManager get_vars() 27885 1726882554.91898: done with get_vars() 27885 1726882554.91900: filtering new block on tags 27885 1726882554.91932: done filtering new block on tags 27885 1726882554.91935: in VariableManager get_vars() 27885 1726882554.91959: done with get_vars() 27885 1726882554.91960: filtering new block on tags 27885 1726882554.92071: done filtering new block on tags 27885 1726882554.92074: in VariableManager get_vars() 27885 1726882554.92144: done with get_vars() 27885 1726882554.92146: filtering new block on tags 27885 1726882554.92183: done filtering new block on tags 27885 1726882554.92186: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 27885 1726882554.92196: extending task lists for all hosts with included blocks 27885 1726882554.95626: done extending task lists 27885 1726882554.95628: done processing included files 27885 1726882554.95629: results queue empty 27885 1726882554.95630: checking for any_errors_fatal 27885 1726882554.95633: done checking for any_errors_fatal 27885 1726882554.95633: checking for max_fail_percentage 27885 1726882554.95635: done checking for max_fail_percentage 27885 1726882554.95635: checking to see if all hosts have failed and the running result is not ok 27885 1726882554.95637: done checking to see if all hosts have failed 27885 1726882554.95637: getting the remaining hosts for this loop 27885 1726882554.95638: done getting the remaining hosts for this loop 27885 1726882554.95641: getting the next task for host managed_node2 27885 1726882554.95646: done getting next task for host managed_node2 27885 1726882554.95648: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 27885 1726882554.95652: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882554.95662: getting variables 27885 1726882554.95663: in VariableManager get_vars() 27885 1726882554.95998: Calling all_inventory to load vars for managed_node2 27885 1726882554.96002: Calling groups_inventory to load vars for managed_node2 27885 1726882554.96004: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882554.96011: Calling all_plugins_play to load vars for managed_node2 27885 1726882554.96018: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882554.96021: Calling groups_plugins_play to load vars for managed_node2 27885 1726882554.98631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882555.00277: done with get_vars() 27885 1726882555.00313: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:35:55 -0400 (0:00:00.153) 0:00:27.646 ****** 27885 1726882555.00403: entering _queue_task() for managed_node2/setup 27885 1726882555.00784: worker is 1 (out of 1 available) 27885 1726882555.01001: exiting _queue_task() for managed_node2/setup 27885 1726882555.01011: done queuing things up, now waiting for results queue to drain 27885 1726882555.01015: waiting for pending results... 
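Note: the set_facts.yml tasks queued from here on are guarded bookkeeping. The fact-gathering task below is skipped in this run because the conditional __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluates to False (all required fact keys are already cached), and the ostree check after it is skipped because __network_is_ostree is already defined. A minimal sketch of the guarded setup task, using the condition exactly as reported in the log; the gather_subset value is an assumption for illustration only.

```yaml
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min  # illustrative; the role may request specific subsets
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
```
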
27885 1726882555.01117: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 27885 1726882555.01350: in run() - task 12673a56-9f93-3fa5-01be-0000000006a7 27885 1726882555.01354: variable 'ansible_search_path' from source: unknown 27885 1726882555.01357: variable 'ansible_search_path' from source: unknown 27885 1726882555.01360: calling self._execute() 27885 1726882555.01461: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882555.01472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882555.01483: variable 'omit' from source: magic vars 27885 1726882555.02002: variable 'ansible_distribution_major_version' from source: facts 27885 1726882555.02025: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882555.02545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882555.05148: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882555.05222: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882555.05267: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882555.05313: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882555.05351: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882555.05435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882555.05557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882555.05561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882555.05563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882555.05569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882555.05629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882555.05656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882555.05692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882555.05737: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882555.05755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882555.05920: variable '__network_required_facts' from source: role '' defaults 27885 1726882555.05934: variable 'ansible_facts' from source: unknown 27885 1726882555.06932: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 27885 1726882555.06941: when evaluation is False, skipping this task 27885 1726882555.06949: _execute() done 27885 1726882555.06957: dumping result to json 27885 1726882555.06970: done dumping result, returning 27885 1726882555.06982: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-3fa5-01be-0000000006a7] 27885 1726882555.07079: sending task result for task 12673a56-9f93-3fa5-01be-0000000006a7 27885 1726882555.07158: done sending task result for task 12673a56-9f93-3fa5-01be-0000000006a7 27885 1726882555.07162: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27885 1726882555.07234: no more pending results, returning what we have 27885 1726882555.07238: results queue empty 27885 1726882555.07239: checking for any_errors_fatal 27885 1726882555.07241: done checking for any_errors_fatal 27885 1726882555.07242: checking for max_fail_percentage 27885 1726882555.07244: done checking for max_fail_percentage 27885 1726882555.07245: checking to see if all hosts have failed and the running result is not ok 27885 1726882555.07246: done checking to see if all hosts have failed 27885 1726882555.07246: getting the remaining hosts for this loop 27885 1726882555.07249: done getting the remaining hosts for this loop 27885 1726882555.07253: getting the next task for host managed_node2 27885 1726882555.07264: done getting next task for host managed_node2 27885 1726882555.07269: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 27885 1726882555.07275: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882555.07499: getting variables 27885 1726882555.07502: in VariableManager get_vars() 27885 1726882555.07541: Calling all_inventory to load vars for managed_node2 27885 1726882555.07544: Calling groups_inventory to load vars for managed_node2 27885 1726882555.07547: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882555.07556: Calling all_plugins_play to load vars for managed_node2 27885 1726882555.07560: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882555.07563: Calling groups_plugins_play to load vars for managed_node2 27885 1726882555.08755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882555.10672: done with get_vars() 27885 1726882555.10707: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:35:55 -0400 (0:00:00.103) 0:00:27.750 ****** 27885 1726882555.10787: entering _queue_task() for managed_node2/stat 27885 1726882555.11046: worker is 1 (out of 1 available) 27885 1726882555.11059: exiting _queue_task() for managed_node2/stat 27885 1726882555.11071: done queuing things up, now waiting for results queue to drain 27885 1726882555.11073: waiting for pending results... 27885 1726882555.11251: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 27885 1726882555.11349: in run() - task 12673a56-9f93-3fa5-01be-0000000006a9 27885 1726882555.11362: variable 'ansible_search_path' from source: unknown 27885 1726882555.11365: variable 'ansible_search_path' from source: unknown 27885 1726882555.11397: calling self._execute() 27885 1726882555.11473: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882555.11476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882555.11484: variable 'omit' from source: magic vars 27885 1726882555.11988: variable 'ansible_distribution_major_version' from source: facts 27885 1726882555.11996: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882555.12030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882555.12288: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882555.12335: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882555.12367: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882555.12397: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882555.12486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882555.12512: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882555.12545: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, 
class_only=False) 27885 1726882555.12604: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882555.12668: variable '__network_is_ostree' from source: set_fact 27885 1726882555.12766: Evaluated conditional (not __network_is_ostree is defined): False 27885 1726882555.12801: when evaluation is False, skipping this task 27885 1726882555.13047: _execute() done 27885 1726882555.13050: dumping result to json 27885 1726882555.13052: done dumping result, returning 27885 1726882555.13054: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-3fa5-01be-0000000006a9] 27885 1726882555.13056: sending task result for task 12673a56-9f93-3fa5-01be-0000000006a9 27885 1726882555.13120: done sending task result for task 12673a56-9f93-3fa5-01be-0000000006a9 27885 1726882555.13122: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 27885 1726882555.13185: no more pending results, returning what we have 27885 1726882555.13192: results queue empty 27885 1726882555.13196: checking for any_errors_fatal 27885 1726882555.13205: done checking for any_errors_fatal 27885 1726882555.13206: checking for max_fail_percentage 27885 1726882555.13208: done checking for max_fail_percentage 27885 1726882555.13209: checking to see if all hosts have failed and the running result is not ok 27885 1726882555.13210: done checking to see if all hosts have failed 27885 1726882555.13210: getting the remaining hosts for this loop 27885 1726882555.13212: done getting the remaining hosts for this loop 27885 1726882555.13217: getting the next task for host managed_node2 27885 1726882555.13226: done getting next task for host managed_node2 27885 1726882555.13230: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 27885 1726882555.13235: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
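This stat-based check and the companion set_fact at set_facts.yml:17 (logged next) are both skipped because __network_is_ostree was already established earlier in the run, so "not __network_is_ostree is defined" evaluates to False. A sketch of how such a guarded pair is commonly written; the probed path and the register name are assumptions, since the skipped tasks never print their arguments here:

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted        # assumed marker file; not shown in this log
      register: __ostree_booted_stat    # hypothetical register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined
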
False 27885 1726882555.13255: getting variables 27885 1726882555.13257: in VariableManager get_vars() 27885 1726882555.13475: Calling all_inventory to load vars for managed_node2 27885 1726882555.13478: Calling groups_inventory to load vars for managed_node2 27885 1726882555.13481: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882555.13514: Calling all_plugins_play to load vars for managed_node2 27885 1726882555.13519: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882555.13523: Calling groups_plugins_play to load vars for managed_node2 27885 1726882555.15129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882555.16206: done with get_vars() 27885 1726882555.16226: done getting variables 27885 1726882555.16295: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:35:55 -0400 (0:00:00.055) 0:00:27.805 ****** 27885 1726882555.16329: entering _queue_task() for managed_node2/set_fact 27885 1726882555.16757: worker is 1 (out of 1 available) 27885 1726882555.16771: exiting _queue_task() for managed_node2/set_fact 27885 1726882555.16784: done queuing things up, now waiting for results queue to drain 27885 1726882555.16785: waiting for pending results... 27885 1726882555.17311: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 27885 1726882555.17450: in run() - task 12673a56-9f93-3fa5-01be-0000000006aa 27885 1726882555.17471: variable 'ansible_search_path' from source: unknown 27885 1726882555.17479: variable 'ansible_search_path' from source: unknown 27885 1726882555.17537: calling self._execute() 27885 1726882555.17659: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882555.17674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882555.17696: variable 'omit' from source: magic vars 27885 1726882555.18184: variable 'ansible_distribution_major_version' from source: facts 27885 1726882555.18197: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882555.18316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882555.18509: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882555.18540: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882555.18564: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882555.18590: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882555.18655: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882555.18672: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882555.18691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882555.18715: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882555.18778: variable '__network_is_ostree' from source: set_fact 27885 1726882555.18784: Evaluated conditional (not __network_is_ostree is defined): False 27885 1726882555.18787: when evaluation is False, skipping this task 27885 1726882555.18791: _execute() done 27885 1726882555.18797: dumping result to json 27885 1726882555.18801: done dumping result, returning 27885 1726882555.18811: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-3fa5-01be-0000000006aa] 27885 1726882555.18815: sending task result for task 12673a56-9f93-3fa5-01be-0000000006aa 27885 1726882555.18895: done sending task result for task 12673a56-9f93-3fa5-01be-0000000006aa 27885 1726882555.18899: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 27885 1726882555.18945: no more pending results, returning what we have 27885 1726882555.18949: results queue empty 27885 1726882555.18950: checking for any_errors_fatal 27885 1726882555.18958: done checking for any_errors_fatal 27885 1726882555.18959: checking for max_fail_percentage 27885 1726882555.18960: done checking for max_fail_percentage 27885 1726882555.18961: checking to see if all hosts have failed and the running result is not ok 27885 1726882555.18962: done checking to see if all hosts have failed 27885 1726882555.18963: getting the remaining hosts for this loop 27885 1726882555.18965: done getting the remaining hosts for this loop 27885 1726882555.18968: getting the next task for host managed_node2 27885 1726882555.18978: done getting next task for host managed_node2 27885 1726882555.18981: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 27885 1726882555.18986: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 27885 1726882555.19007: getting variables 27885 1726882555.19009: in VariableManager get_vars() 27885 1726882555.19046: Calling all_inventory to load vars for managed_node2 27885 1726882555.19049: Calling groups_inventory to load vars for managed_node2 27885 1726882555.19050: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882555.19058: Calling all_plugins_play to load vars for managed_node2 27885 1726882555.19060: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882555.19062: Calling groups_plugins_play to load vars for managed_node2 27885 1726882555.20137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882555.21306: done with get_vars() 27885 1726882555.21323: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:35:55 -0400 (0:00:00.050) 0:00:27.856 ****** 27885 1726882555.21388: entering _queue_task() for managed_node2/service_facts 27885 1726882555.21625: worker is 1 (out of 1 available) 27885 1726882555.21636: exiting _queue_task() for managed_node2/service_facts 27885 1726882555.21650: done queuing things up, now waiting for results queue to drain 27885 1726882555.21651: waiting for pending results... 27885 1726882555.21829: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 27885 1726882555.21913: in run() - task 12673a56-9f93-3fa5-01be-0000000006ac 27885 1726882555.21925: variable 'ansible_search_path' from source: unknown 27885 1726882555.21928: variable 'ansible_search_path' from source: unknown 27885 1726882555.21956: calling self._execute() 27885 1726882555.22036: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882555.22040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882555.22050: variable 'omit' from source: magic vars 27885 1726882555.22329: variable 'ansible_distribution_major_version' from source: facts 27885 1726882555.22339: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882555.22345: variable 'omit' from source: magic vars 27885 1726882555.22398: variable 'omit' from source: magic vars 27885 1726882555.22422: variable 'omit' from source: magic vars 27885 1726882555.22454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882555.22482: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882555.22500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882555.22514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882555.22524: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882555.22550: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882555.22554: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882555.22557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 
1726882555.22627: Set connection var ansible_pipelining to False 27885 1726882555.22630: Set connection var ansible_connection to ssh 27885 1726882555.22636: Set connection var ansible_timeout to 10 27885 1726882555.22638: Set connection var ansible_shell_type to sh 27885 1726882555.22647: Set connection var ansible_shell_executable to /bin/sh 27885 1726882555.22650: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882555.22667: variable 'ansible_shell_executable' from source: unknown 27885 1726882555.22670: variable 'ansible_connection' from source: unknown 27885 1726882555.22673: variable 'ansible_module_compression' from source: unknown 27885 1726882555.22676: variable 'ansible_shell_type' from source: unknown 27885 1726882555.22678: variable 'ansible_shell_executable' from source: unknown 27885 1726882555.22680: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882555.22683: variable 'ansible_pipelining' from source: unknown 27885 1726882555.22686: variable 'ansible_timeout' from source: unknown 27885 1726882555.22688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882555.22834: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882555.22842: variable 'omit' from source: magic vars 27885 1726882555.22847: starting attempt loop 27885 1726882555.22849: running the handler 27885 1726882555.22866: _low_level_execute_command(): starting 27885 1726882555.22870: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882555.23379: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882555.23383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882555.23386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882555.23389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882555.23439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882555.23442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882555.23445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882555.23521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882555.25188: stdout chunk (state=3): >>>/root <<< 27885 1726882555.25287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882555.25317: stderr chunk (state=3): >>><<< 27885 
1726882555.25321: stdout chunk (state=3): >>><<< 27885 1726882555.25340: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882555.25351: _low_level_execute_command(): starting 27885 1726882555.25357: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882555.2534006-29180-228910724029123 `" && echo ansible-tmp-1726882555.2534006-29180-228910724029123="` echo /root/.ansible/tmp/ansible-tmp-1726882555.2534006-29180-228910724029123 `" ) && sleep 0' 27885 1726882555.25774: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882555.25778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882555.25788: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882555.25795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882555.25829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882555.25833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882555.25904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882555.27779: stdout chunk (state=3): >>>ansible-tmp-1726882555.2534006-29180-228910724029123=/root/.ansible/tmp/ansible-tmp-1726882555.2534006-29180-228910724029123 <<< 27885 1726882555.27882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882555.27909: 
stderr chunk (state=3): >>><<< 27885 1726882555.27912: stdout chunk (state=3): >>><<< 27885 1726882555.27928: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882555.2534006-29180-228910724029123=/root/.ansible/tmp/ansible-tmp-1726882555.2534006-29180-228910724029123 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882555.27961: variable 'ansible_module_compression' from source: unknown 27885 1726882555.27997: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 27885 1726882555.28032: variable 'ansible_facts' from source: unknown 27885 1726882555.28082: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882555.2534006-29180-228910724029123/AnsiballZ_service_facts.py 27885 1726882555.28177: Sending initial data 27885 1726882555.28180: Sent initial data (162 bytes) 27885 1726882555.28578: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882555.28615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882555.28618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882555.28620: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882555.28622: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882555.28664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882555.28667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882555.28734: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 27885 1726882555.30263: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27885 1726882555.30266: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882555.30324: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882555.30382: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmplckmkgcs /root/.ansible/tmp/ansible-tmp-1726882555.2534006-29180-228910724029123/AnsiballZ_service_facts.py <<< 27885 1726882555.30386: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882555.2534006-29180-228910724029123/AnsiballZ_service_facts.py" <<< 27885 1726882555.30445: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmplckmkgcs" to remote "/root/.ansible/tmp/ansible-tmp-1726882555.2534006-29180-228910724029123/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882555.2534006-29180-228910724029123/AnsiballZ_service_facts.py" <<< 27885 1726882555.31067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882555.31105: stderr chunk (state=3): >>><<< 27885 1726882555.31108: stdout chunk (state=3): >>><<< 27885 1726882555.31162: done transferring module to remote 27885 1726882555.31170: _low_level_execute_command(): starting 27885 1726882555.31174: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882555.2534006-29180-228910724029123/ /root/.ansible/tmp/ansible-tmp-1726882555.2534006-29180-228910724029123/AnsiballZ_service_facts.py && sleep 0' 27885 1726882555.31561: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882555.31564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882555.31597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882555.31600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882555.31602: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882555.31607: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882555.31658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882555.31661: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882555.31726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882555.33473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882555.33490: stderr chunk (state=3): >>><<< 27885 1726882555.33498: stdout chunk (state=3): >>><<< 27885 1726882555.33509: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882555.33511: _low_level_execute_command(): starting 27885 1726882555.33516: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882555.2534006-29180-228910724029123/AnsiballZ_service_facts.py && sleep 0' 27885 1726882555.33912: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882555.33916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882555.33918: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882555.33921: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882555.33955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 
1726882555.33970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882555.34036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882556.83903: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, 
"hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 27885 1726882556.83926: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 27885 1726882556.83946: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": 
"enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 27885 1726882556.83975: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 27885 1726882556.83981: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": 
"systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 27885 1726882556.85502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882556.85522: stderr chunk (state=3): >>><<< 27885 1726882556.85525: stdout chunk (state=3): >>><<< 27885 1726882556.85701: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": 
"systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": 
{"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": 
"man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
27885 1726882556.86326: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882555.2534006-29180-228910724029123/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882556.86343: _low_level_execute_command(): starting 27885 1726882556.86352: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882555.2534006-29180-228910724029123/ > /dev/null 2>&1 && sleep 0' 27885 1726882556.86899: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882556.86916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882556.86930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882556.86940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882556.86982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882556.87006: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882556.87062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882556.88863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882556.88870: stderr chunk (state=3): >>><<< 27885 1726882556.88872: stdout chunk (state=3): >>><<< 27885 1726882556.88884: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882556.88890: handler run complete 27885 1726882556.89076: variable 'ansible_facts' from source: unknown 27885 1726882556.89246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882556.89681: variable 'ansible_facts' from source: unknown 27885 1726882556.89843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882556.90013: attempt loop complete, returning result 27885 1726882556.90018: _execute() done 27885 1726882556.90020: dumping result to json 27885 1726882556.90075: done dumping result, returning 27885 1726882556.90083: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-3fa5-01be-0000000006ac] 27885 1726882556.90088: sending task result for task 12673a56-9f93-3fa5-01be-0000000006ac ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27885 1726882556.90852: no more pending results, returning what we have 27885 1726882556.90855: results queue empty 27885 1726882556.90855: checking for any_errors_fatal 27885 1726882556.90860: done checking for any_errors_fatal 27885 1726882556.90860: checking for max_fail_percentage 27885 1726882556.90861: done checking for max_fail_percentage 27885 1726882556.90862: checking to see if all hosts have failed and the running result is not ok 27885 1726882556.90863: done checking to see if all hosts have failed 27885 1726882556.90864: getting the remaining hosts for this loop 27885 1726882556.90865: done getting the remaining hosts for this loop 27885 1726882556.90867: getting the next task for host managed_node2 27885 1726882556.90873: done getting next task for host managed_node2 27885 1726882556.90876: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 27885 1726882556.90885: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882556.90901: done sending task result for task 12673a56-9f93-3fa5-01be-0000000006ac 27885 1726882556.90904: WORKER PROCESS EXITING 27885 1726882556.90909: getting variables 27885 1726882556.90910: in VariableManager get_vars() 27885 1726882556.90934: Calling all_inventory to load vars for managed_node2 27885 1726882556.90936: Calling groups_inventory to load vars for managed_node2 27885 1726882556.90937: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882556.90944: Calling all_plugins_play to load vars for managed_node2 27885 1726882556.90945: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882556.90947: Calling groups_plugins_play to load vars for managed_node2 27885 1726882556.91739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882556.93065: done with get_vars() 27885 1726882556.93085: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:35:56 -0400 (0:00:01.717) 0:00:29.574 ****** 27885 1726882556.93158: entering _queue_task() for managed_node2/package_facts 27885 1726882556.93396: worker is 1 (out of 1 available) 27885 1726882556.93410: exiting _queue_task() for managed_node2/package_facts 27885 1726882556.93421: done queuing things up, now waiting for results queue to drain 27885 1726882556.93423: waiting for pending results... 27885 1726882556.93600: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 27885 1726882556.93699: in run() - task 12673a56-9f93-3fa5-01be-0000000006ad 27885 1726882556.93708: variable 'ansible_search_path' from source: unknown 27885 1726882556.93712: variable 'ansible_search_path' from source: unknown 27885 1726882556.93739: calling self._execute() 27885 1726882556.93818: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882556.93824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882556.93832: variable 'omit' from source: magic vars 27885 1726882556.94115: variable 'ansible_distribution_major_version' from source: facts 27885 1726882556.94125: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882556.94131: variable 'omit' from source: magic vars 27885 1726882556.94173: variable 'omit' from source: magic vars 27885 1726882556.94202: variable 'omit' from source: magic vars 27885 1726882556.94234: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882556.94261: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882556.94276: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882556.94296: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882556.94305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882556.94334: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882556.94338: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882556.94341: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882556.94415: Set connection var ansible_pipelining to False 27885 1726882556.94420: Set connection var ansible_connection to ssh 27885 1726882556.94426: Set connection var ansible_timeout to 10 27885 1726882556.94429: Set connection var ansible_shell_type to sh 27885 1726882556.94435: Set connection var ansible_shell_executable to /bin/sh 27885 1726882556.94440: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882556.94458: variable 'ansible_shell_executable' from source: unknown 27885 1726882556.94461: variable 'ansible_connection' from source: unknown 27885 1726882556.94464: variable 'ansible_module_compression' from source: unknown 27885 1726882556.94466: variable 'ansible_shell_type' from source: unknown 27885 1726882556.94470: variable 'ansible_shell_executable' from source: unknown 27885 1726882556.94472: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882556.94474: variable 'ansible_pipelining' from source: unknown 27885 1726882556.94476: variable 'ansible_timeout' from source: unknown 27885 1726882556.94479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882556.94627: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882556.94639: variable 'omit' from source: magic vars 27885 1726882556.94642: starting attempt loop 27885 1726882556.94645: running the handler 27885 1726882556.94657: _low_level_execute_command(): starting 27885 1726882556.94664: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882556.95170: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882556.95173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882556.95179: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882556.95183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882556.95237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882556.95240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882556.95243: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882556.95315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882556.96910: stdout chunk (state=3): >>>/root <<< 27885 1726882556.97006: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 27885 1726882556.97047: stderr chunk (state=3): >>><<< 27885 1726882556.97050: stdout chunk (state=3): >>><<< 27885 1726882556.97062: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882556.97099: _low_level_execute_command(): starting 27885 1726882556.97102: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882556.9706802-29244-20951543216461 `" && echo ansible-tmp-1726882556.9706802-29244-20951543216461="` echo /root/.ansible/tmp/ansible-tmp-1726882556.9706802-29244-20951543216461 `" ) && sleep 0' 27885 1726882556.97526: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882556.97529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882556.97532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882556.97541: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882556.97543: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882556.97546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882556.97585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882556.97588: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882556.97655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882556.99514: stdout chunk (state=3): 
>>>ansible-tmp-1726882556.9706802-29244-20951543216461=/root/.ansible/tmp/ansible-tmp-1726882556.9706802-29244-20951543216461 <<< 27885 1726882556.99627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882556.99650: stderr chunk (state=3): >>><<< 27885 1726882556.99653: stdout chunk (state=3): >>><<< 27885 1726882556.99666: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882556.9706802-29244-20951543216461=/root/.ansible/tmp/ansible-tmp-1726882556.9706802-29244-20951543216461 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882556.99704: variable 'ansible_module_compression' from source: unknown 27885 1726882556.99745: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 27885 1726882556.99788: variable 'ansible_facts' from source: unknown 27885 1726882556.99905: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882556.9706802-29244-20951543216461/AnsiballZ_package_facts.py 27885 1726882557.00003: Sending initial data 27885 1726882557.00007: Sent initial data (161 bytes) 27885 1726882557.00447: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882557.00450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882557.00454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27885 1726882557.00457: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882557.00459: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882557.00511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/6f323b04b0' <<< 27885 1726882557.00516: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882557.00575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882557.02094: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882557.02147: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882557.02210: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp0x5j7gma /root/.ansible/tmp/ansible-tmp-1726882556.9706802-29244-20951543216461/AnsiballZ_package_facts.py <<< 27885 1726882557.02216: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882556.9706802-29244-20951543216461/AnsiballZ_package_facts.py" <<< 27885 1726882557.02271: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp0x5j7gma" to remote "/root/.ansible/tmp/ansible-tmp-1726882556.9706802-29244-20951543216461/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882556.9706802-29244-20951543216461/AnsiballZ_package_facts.py" <<< 27885 1726882557.03426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882557.03585: stderr chunk (state=3): >>><<< 27885 1726882557.03588: stdout chunk (state=3): >>><<< 27885 1726882557.03591: done transferring module to remote 27885 1726882557.03600: _low_level_execute_command(): starting 27885 1726882557.03604: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882556.9706802-29244-20951543216461/ /root/.ansible/tmp/ansible-tmp-1726882556.9706802-29244-20951543216461/AnsiballZ_package_facts.py && sleep 0' 27885 1726882557.04213: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882557.04217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882557.04260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882557.04278: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882557.04310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882557.04406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882557.06170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882557.06181: stdout chunk (state=3): >>><<< 27885 1726882557.06208: stderr chunk (state=3): >>><<< 27885 1726882557.06299: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882557.06303: _low_level_execute_command(): starting 27885 1726882557.06306: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882556.9706802-29244-20951543216461/AnsiballZ_package_facts.py && sleep 0' 27885 1726882557.06801: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882557.06817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882557.06835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882557.06871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882557.06884: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882557.06956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882557.50686: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": 
"glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": 
"1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": 
"7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "ep<<< 27885 1726882557.50822: stdout chunk (state=3): >>>och": 
null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", 
"release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": 
[{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", 
"version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": 
"510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", 
"epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.1<<< 27885 1726882557.50837: stdout chunk (state=3): >>>9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": 
"python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 27885 1726882557.52501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882557.52572: stderr chunk (state=3): >>>Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882557.52585: stderr chunk (state=3): >>><<< 27885 1726882557.52597: stdout chunk (state=3): >>><<< 27885 1726882557.52803: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 27885 1726882557.55018: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882556.9706802-29244-20951543216461/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882557.55035: _low_level_execute_command(): starting 27885 1726882557.55038: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882556.9706802-29244-20951543216461/ > /dev/null 2>&1 && sleep 0' 27885 1726882557.55489: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882557.55494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882557.55497: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882557.55499: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882557.55501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882557.55555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882557.55565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882557.55567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882557.55623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882557.57516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882557.57519: stdout chunk (state=3): >>><<< 27885 1726882557.57522: stderr chunk (state=3): >>><<< 27885 1726882557.57525: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882557.57698: handler run complete 27885 1726882557.58339: variable 'ansible_facts' from source: unknown 27885 1726882557.58623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882557.59701: variable 'ansible_facts' from source: unknown 27885 1726882557.60060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882557.60756: attempt loop complete, returning result 27885 1726882557.60767: _execute() done 27885 1726882557.60770: dumping result to json 27885 1726882557.60974: done dumping result, returning 27885 1726882557.61012: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-3fa5-01be-0000000006ad] 27885 1726882557.61017: sending task result for task 12673a56-9f93-3fa5-01be-0000000006ad 27885 1726882557.62282: done sending task result for task 12673a56-9f93-3fa5-01be-0000000006ad 27885 1726882557.62285: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27885 1726882557.62370: no more pending results, returning what we have 27885 1726882557.62372: results queue empty 27885 1726882557.62372: checking for any_errors_fatal 27885 1726882557.62376: done checking for any_errors_fatal 27885 1726882557.62376: checking for max_fail_percentage 27885 1726882557.62377: done checking for max_fail_percentage 27885 1726882557.62378: checking to see if all hosts have failed and the running result is not ok 27885 1726882557.62378: done checking to see if all hosts have failed 27885 1726882557.62379: getting the remaining hosts for this loop 27885 1726882557.62380: done getting the remaining hosts for this loop 27885 1726882557.62382: getting the next task for host managed_node2 27885 1726882557.62387: done getting next task for host managed_node2 27885 1726882557.62392: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 27885 1726882557.62397: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882557.62405: getting variables 27885 1726882557.62406: in VariableManager get_vars() 27885 1726882557.62431: Calling all_inventory to load vars for managed_node2 27885 1726882557.62433: Calling groups_inventory to load vars for managed_node2 27885 1726882557.62434: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882557.62441: Calling all_plugins_play to load vars for managed_node2 27885 1726882557.62443: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882557.62446: Calling groups_plugins_play to load vars for managed_node2 27885 1726882557.63167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882557.64031: done with get_vars() 27885 1726882557.64046: done getting variables 27885 1726882557.64090: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:35:57 -0400 (0:00:00.709) 0:00:30.283 ****** 27885 1726882557.64120: entering _queue_task() for managed_node2/debug 27885 1726882557.64352: worker is 1 (out of 1 available) 27885 1726882557.64366: exiting _queue_task() for managed_node2/debug 27885 1726882557.64379: done queuing things up, now waiting for results queue to drain 27885 1726882557.64380: waiting for pending results... 
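Editor's note: the large JSON document above is the result of the "Check which packages are installed" task, which ran the package_facts module with module_args {"manager": ["auto"], "strategy": "first"} and was censored in the callback output because no_log was set. Each key of ansible_facts.packages is a package name mapping to a list of installed instances (name, version, release, epoch, arch, source). A minimal sketch of an equivalent task plus a hypothetical consumer of the gathered facts, under those assumptions (the actual role task may differ):

- name: Check which packages are installed (sketch)
  ansible.builtin.package_facts:
    manager: auto      # matches the module_args logged above
    strategy: first    # matches the module_args logged above
  no_log: true         # the logged result was censored for this reason

- name: Show one gathered fact (hypothetical consumer, not part of the role)
  ansible.builtin.debug:
    msg: "NetworkManager-tui version: {{ ansible_facts.packages['NetworkManager-tui'][0].version }}"
  when: "'NetworkManager-tui' in ansible_facts.packages"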
27885 1726882557.64562: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 27885 1726882557.64649: in run() - task 12673a56-9f93-3fa5-01be-000000000642 27885 1726882557.64661: variable 'ansible_search_path' from source: unknown 27885 1726882557.64665: variable 'ansible_search_path' from source: unknown 27885 1726882557.64692: calling self._execute() 27885 1726882557.64774: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882557.64777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882557.64786: variable 'omit' from source: magic vars 27885 1726882557.65062: variable 'ansible_distribution_major_version' from source: facts 27885 1726882557.65071: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882557.65078: variable 'omit' from source: magic vars 27885 1726882557.65115: variable 'omit' from source: magic vars 27885 1726882557.65184: variable 'network_provider' from source: set_fact 27885 1726882557.65201: variable 'omit' from source: magic vars 27885 1726882557.65232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882557.65258: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882557.65279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882557.65297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882557.65313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882557.65337: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882557.65340: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882557.65343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882557.65418: Set connection var ansible_pipelining to False 27885 1726882557.65421: Set connection var ansible_connection to ssh 27885 1726882557.65427: Set connection var ansible_timeout to 10 27885 1726882557.65430: Set connection var ansible_shell_type to sh 27885 1726882557.65434: Set connection var ansible_shell_executable to /bin/sh 27885 1726882557.65440: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882557.65457: variable 'ansible_shell_executable' from source: unknown 27885 1726882557.65461: variable 'ansible_connection' from source: unknown 27885 1726882557.65464: variable 'ansible_module_compression' from source: unknown 27885 1726882557.65466: variable 'ansible_shell_type' from source: unknown 27885 1726882557.65470: variable 'ansible_shell_executable' from source: unknown 27885 1726882557.65472: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882557.65475: variable 'ansible_pipelining' from source: unknown 27885 1726882557.65479: variable 'ansible_timeout' from source: unknown 27885 1726882557.65481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882557.65579: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 27885 1726882557.65587: variable 'omit' from source: magic vars 27885 1726882557.65598: starting attempt loop 27885 1726882557.65601: running the handler 27885 1726882557.65634: handler run complete 27885 1726882557.65645: attempt loop complete, returning result 27885 1726882557.65648: _execute() done 27885 1726882557.65651: dumping result to json 27885 1726882557.65653: done dumping result, returning 27885 1726882557.65660: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-3fa5-01be-000000000642] 27885 1726882557.65665: sending task result for task 12673a56-9f93-3fa5-01be-000000000642 27885 1726882557.65745: done sending task result for task 12673a56-9f93-3fa5-01be-000000000642 27885 1726882557.65748: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 27885 1726882557.65809: no more pending results, returning what we have 27885 1726882557.65813: results queue empty 27885 1726882557.65814: checking for any_errors_fatal 27885 1726882557.65823: done checking for any_errors_fatal 27885 1726882557.65823: checking for max_fail_percentage 27885 1726882557.65825: done checking for max_fail_percentage 27885 1726882557.65825: checking to see if all hosts have failed and the running result is not ok 27885 1726882557.65826: done checking to see if all hosts have failed 27885 1726882557.65827: getting the remaining hosts for this loop 27885 1726882557.65829: done getting the remaining hosts for this loop 27885 1726882557.65831: getting the next task for host managed_node2 27885 1726882557.65839: done getting next task for host managed_node2 27885 1726882557.65842: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 27885 1726882557.65846: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882557.65856: getting variables 27885 1726882557.65857: in VariableManager get_vars() 27885 1726882557.65894: Calling all_inventory to load vars for managed_node2 27885 1726882557.65897: Calling groups_inventory to load vars for managed_node2 27885 1726882557.65899: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882557.65906: Calling all_plugins_play to load vars for managed_node2 27885 1726882557.65909: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882557.65911: Calling groups_plugins_play to load vars for managed_node2 27885 1726882557.66647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882557.67508: done with get_vars() 27885 1726882557.67523: done getting variables 27885 1726882557.67561: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:35:57 -0400 (0:00:00.034) 0:00:30.318 ****** 27885 1726882557.67583: entering _queue_task() for managed_node2/fail 27885 1726882557.67802: worker is 1 (out of 1 available) 27885 1726882557.67814: exiting _queue_task() for managed_node2/fail 27885 1726882557.67825: done queuing things up, now waiting for results queue to drain 27885 1726882557.67827: waiting for pending results... 
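Editor's note: the "Print network provider" task queued above resolves to a plain debug action; its result, logged earlier, is "Using network provider: nm", with network_provider coming from an earlier set_fact. A hedged sketch of what the task at roles/network/tasks/main.yml:7 could look like; the when guard shown here is only illustrative, since in the real role the distribution check is most likely inherited from an enclosing block:

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
  when: ansible_distribution_major_version != '6'   # conditional evaluated True in the log above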
27885 1726882557.67992: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 27885 1726882557.68078: in run() - task 12673a56-9f93-3fa5-01be-000000000643 27885 1726882557.68095: variable 'ansible_search_path' from source: unknown 27885 1726882557.68099: variable 'ansible_search_path' from source: unknown 27885 1726882557.68124: calling self._execute() 27885 1726882557.68196: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882557.68206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882557.68215: variable 'omit' from source: magic vars 27885 1726882557.68488: variable 'ansible_distribution_major_version' from source: facts 27885 1726882557.68503: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882557.68579: variable 'network_state' from source: role '' defaults 27885 1726882557.68588: Evaluated conditional (network_state != {}): False 27885 1726882557.68591: when evaluation is False, skipping this task 27885 1726882557.68600: _execute() done 27885 1726882557.68605: dumping result to json 27885 1726882557.68608: done dumping result, returning 27885 1726882557.68616: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-3fa5-01be-000000000643] 27885 1726882557.68620: sending task result for task 12673a56-9f93-3fa5-01be-000000000643 27885 1726882557.68699: done sending task result for task 12673a56-9f93-3fa5-01be-000000000643 27885 1726882557.68703: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27885 1726882557.68762: no more pending results, returning what we have 27885 1726882557.68765: results queue empty 27885 1726882557.68766: checking for any_errors_fatal 27885 1726882557.68772: done checking for any_errors_fatal 27885 1726882557.68772: checking for max_fail_percentage 27885 1726882557.68774: done checking for max_fail_percentage 27885 1726882557.68774: checking to see if all hosts have failed and the running result is not ok 27885 1726882557.68775: done checking to see if all hosts have failed 27885 1726882557.68776: getting the remaining hosts for this loop 27885 1726882557.68777: done getting the remaining hosts for this loop 27885 1726882557.68780: getting the next task for host managed_node2 27885 1726882557.68785: done getting next task for host managed_node2 27885 1726882557.68788: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 27885 1726882557.68792: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882557.68810: getting variables 27885 1726882557.68813: in VariableManager get_vars() 27885 1726882557.68847: Calling all_inventory to load vars for managed_node2 27885 1726882557.68850: Calling groups_inventory to load vars for managed_node2 27885 1726882557.68852: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882557.68859: Calling all_plugins_play to load vars for managed_node2 27885 1726882557.68861: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882557.68864: Calling groups_plugins_play to load vars for managed_node2 27885 1726882557.69676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882557.70536: done with get_vars() 27885 1726882557.70551: done getting variables 27885 1726882557.70590: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:35:57 -0400 (0:00:00.030) 0:00:30.348 ****** 27885 1726882557.70616: entering _queue_task() for managed_node2/fail 27885 1726882557.70823: worker is 1 (out of 1 available) 27885 1726882557.70838: exiting _queue_task() for managed_node2/fail 27885 1726882557.70850: done queuing things up, now waiting for results queue to drain 27885 1726882557.70851: waiting for pending results... 
27885 1726882557.71024: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 27885 1726882557.71116: in run() - task 12673a56-9f93-3fa5-01be-000000000644 27885 1726882557.71128: variable 'ansible_search_path' from source: unknown 27885 1726882557.71131: variable 'ansible_search_path' from source: unknown 27885 1726882557.71158: calling self._execute() 27885 1726882557.71233: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882557.71237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882557.71245: variable 'omit' from source: magic vars 27885 1726882557.71513: variable 'ansible_distribution_major_version' from source: facts 27885 1726882557.71524: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882557.71602: variable 'network_state' from source: role '' defaults 27885 1726882557.71610: Evaluated conditional (network_state != {}): False 27885 1726882557.71614: when evaluation is False, skipping this task 27885 1726882557.71617: _execute() done 27885 1726882557.71620: dumping result to json 27885 1726882557.71625: done dumping result, returning 27885 1726882557.71636: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-3fa5-01be-000000000644] 27885 1726882557.71639: sending task result for task 12673a56-9f93-3fa5-01be-000000000644 27885 1726882557.71721: done sending task result for task 12673a56-9f93-3fa5-01be-000000000644 27885 1726882557.71724: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27885 1726882557.71782: no more pending results, returning what we have 27885 1726882557.71785: results queue empty 27885 1726882557.71786: checking for any_errors_fatal 27885 1726882557.71791: done checking for any_errors_fatal 27885 1726882557.71792: checking for max_fail_percentage 27885 1726882557.71796: done checking for max_fail_percentage 27885 1726882557.71796: checking to see if all hosts have failed and the running result is not ok 27885 1726882557.71797: done checking to see if all hosts have failed 27885 1726882557.71798: getting the remaining hosts for this loop 27885 1726882557.71800: done getting the remaining hosts for this loop 27885 1726882557.71803: getting the next task for host managed_node2 27885 1726882557.71809: done getting next task for host managed_node2 27885 1726882557.71812: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 27885 1726882557.71816: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882557.71833: getting variables 27885 1726882557.71834: in VariableManager get_vars() 27885 1726882557.71869: Calling all_inventory to load vars for managed_node2 27885 1726882557.71871: Calling groups_inventory to load vars for managed_node2 27885 1726882557.71873: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882557.71881: Calling all_plugins_play to load vars for managed_node2 27885 1726882557.71883: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882557.71886: Calling groups_plugins_play to load vars for managed_node2 27885 1726882557.72621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882557.73596: done with get_vars() 27885 1726882557.73613: done getting variables 27885 1726882557.73658: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:35:57 -0400 (0:00:00.030) 0:00:30.379 ****** 27885 1726882557.73682: entering _queue_task() for managed_node2/fail 27885 1726882557.73929: worker is 1 (out of 1 available) 27885 1726882557.73943: exiting _queue_task() for managed_node2/fail 27885 1726882557.73957: done queuing things up, now waiting for results queue to drain 27885 1726882557.73958: waiting for pending results... 
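Editor's note: both "Abort applying the network state configuration ..." tasks above (main.yml:11 and main.yml:18) use the fail action and were skipped because the quoted condition network_state != {} evaluated to False; network_state comes from the role defaults and is empty in this run. A minimal sketch of such a guarded fail task, with an illustrative message and an assumed companion condition implied by the task name but not quoted in the log:

- name: Abort applying the network state configuration with the initscripts provider (sketch)
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider   # illustrative wording
  when:
    - network_state != {}                   # the false_condition quoted in the log
    - network_provider == "initscripts"     # assumed companion guard, not shown in the log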
27885 1726882557.74139: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 27885 1726882557.74231: in run() - task 12673a56-9f93-3fa5-01be-000000000645 27885 1726882557.74244: variable 'ansible_search_path' from source: unknown 27885 1726882557.74247: variable 'ansible_search_path' from source: unknown 27885 1726882557.74275: calling self._execute() 27885 1726882557.74355: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882557.74358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882557.74366: variable 'omit' from source: magic vars 27885 1726882557.74646: variable 'ansible_distribution_major_version' from source: facts 27885 1726882557.74657: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882557.74781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882557.76285: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882557.76343: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882557.76372: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882557.76399: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882557.76422: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882557.76479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882557.76508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882557.76526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882557.76551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882557.76562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882557.76637: variable 'ansible_distribution_major_version' from source: facts 27885 1726882557.76649: Evaluated conditional (ansible_distribution_major_version | int > 9): True 27885 1726882557.76732: variable 'ansible_distribution' from source: facts 27885 1726882557.76735: variable '__network_rh_distros' from source: role '' defaults 27885 1726882557.76743: Evaluated conditional (ansible_distribution in __network_rh_distros): True 27885 1726882557.76901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882557.76922: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882557.76938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882557.76963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882557.76973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882557.77009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882557.77028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882557.77044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882557.77068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882557.77078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882557.77110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882557.77129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882557.77146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882557.77169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882557.77179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882557.77373: variable 'network_connections' from source: include params 27885 1726882557.77383: variable 'interface0' from source: play vars 27885 1726882557.77437: variable 'interface0' from source: play vars 27885 1726882557.77447: variable 'interface1' from source: play vars 27885 1726882557.77490: variable 'interface1' from source: play vars 27885 1726882557.77501: variable 'network_state' from source: role '' defaults 27885 
1726882557.77546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882557.77658: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882557.77688: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882557.77715: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882557.77746: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882557.77782: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882557.77801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882557.77819: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882557.77837: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882557.77858: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 27885 1726882557.77861: when evaluation is False, skipping this task 27885 1726882557.77865: _execute() done 27885 1726882557.77867: dumping result to json 27885 1726882557.77870: done dumping result, returning 27885 1726882557.77877: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-3fa5-01be-000000000645] 27885 1726882557.77882: sending task result for task 12673a56-9f93-3fa5-01be-000000000645 27885 1726882557.77973: done sending task result for task 12673a56-9f93-3fa5-01be-000000000645 27885 1726882557.77976: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 27885 1726882557.78037: no more pending results, returning what we have 27885 1726882557.78041: results queue empty 27885 1726882557.78042: checking for any_errors_fatal 27885 1726882557.78048: done checking for any_errors_fatal 27885 1726882557.78049: checking for max_fail_percentage 27885 1726882557.78051: done checking for max_fail_percentage 27885 1726882557.78051: checking to see if all hosts have failed and the running result is not ok 27885 1726882557.78052: done checking to see if all hosts have failed 27885 1726882557.78053: getting the remaining hosts for this loop 27885 1726882557.78054: done getting the remaining hosts for this loop 27885 1726882557.78058: getting the next 
task for host managed_node2 27885 1726882557.78066: done getting next task for host managed_node2 27885 1726882557.78069: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 27885 1726882557.78074: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882557.78095: getting variables 27885 1726882557.78097: in VariableManager get_vars() 27885 1726882557.78138: Calling all_inventory to load vars for managed_node2 27885 1726882557.78146: Calling groups_inventory to load vars for managed_node2 27885 1726882557.78148: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882557.78157: Calling all_plugins_play to load vars for managed_node2 27885 1726882557.78159: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882557.78162: Calling groups_plugins_play to load vars for managed_node2 27885 1726882557.78987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882557.79858: done with get_vars() 27885 1726882557.79875: done getting variables 27885 1726882557.79921: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:35:57 -0400 (0:00:00.062) 0:00:30.441 ****** 27885 1726882557.79947: entering _queue_task() for managed_node2/dnf 27885 1726882557.80198: worker is 1 (out of 1 available) 27885 1726882557.80212: exiting _queue_task() for managed_node2/dnf 27885 1726882557.80225: done queuing things up, now waiting for results queue to drain 27885 1726882557.80227: waiting for pending results... 
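The task skipped just above ("Abort applying teaming configuration if the system version of the managed host is EL10 or later") is gated on a Jinja2 expression that Ansible evaluates with its own extra tests: `network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | ...`. A minimal sketch of that evaluation in plain Python follows; the "match" test is a local stand-in (Ansible ships its own, core Jinja2 does not), and the connection list is hypothetical, chosen only to show why the result is False when no connection has type "team".

```python
import re
from jinja2 import Environment

env = Environment()
# Ansible ships a "match" test; core Jinja2 does not, so register a stand-in.
env.tests["match"] = lambda value, pattern: re.match(pattern, str(value)) is not None

# The conditional as quoted in the log for the teaming-abort task.
expr = env.compile_expression(
    'network_connections | selectattr("type", "defined") '
    '| selectattr("type", "match", "^team$") | list | length > 0 '
    'or network_state.get("interfaces", []) | selectattr("type", "defined") '
    '| selectattr("type", "match", "^team$") | list | length > 0'
)

# Hypothetical data: two ethernet connections and the role-default empty state.
network_connections = [
    {"name": "ethtest0", "type": "ethernet"},
    {"name": "ethtest1", "type": "ethernet"},
]
network_state = {}

print(expr(network_connections=network_connections, network_state=network_state))
# -> False, matching "Evaluated conditional (...): False" and the skipped task
```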
27885 1726882557.80412: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 27885 1726882557.80499: in run() - task 12673a56-9f93-3fa5-01be-000000000646 27885 1726882557.80512: variable 'ansible_search_path' from source: unknown 27885 1726882557.80517: variable 'ansible_search_path' from source: unknown 27885 1726882557.80544: calling self._execute() 27885 1726882557.80622: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882557.80625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882557.80635: variable 'omit' from source: magic vars 27885 1726882557.80903: variable 'ansible_distribution_major_version' from source: facts 27885 1726882557.80912: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882557.81059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882557.82604: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882557.82656: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882557.82682: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882557.82709: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882557.82729: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882557.82788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882557.82811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882557.82829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882557.82854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882557.82865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882557.82944: variable 'ansible_distribution' from source: facts 27885 1726882557.82948: variable 'ansible_distribution_major_version' from source: facts 27885 1726882557.82959: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 27885 1726882557.83036: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882557.83119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882557.83135: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882557.83151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882557.83176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882557.83186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882557.83218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882557.83234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882557.83250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882557.83272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882557.83282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882557.83313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882557.83331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882557.83346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882557.83369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882557.83379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882557.83479: variable 'network_connections' from source: include params 27885 1726882557.83492: variable 'interface0' from source: play vars 27885 1726882557.83542: variable 'interface0' from source: play vars 27885 1726882557.83549: variable 'interface1' from source: play vars 27885 1726882557.83592: variable 'interface1' from source: play vars 27885 1726882557.83639: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882557.83745: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882557.83773: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882557.83797: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882557.83819: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882557.83862: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882557.83878: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882557.83902: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882557.83920: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882557.83957: variable '__network_team_connections_defined' from source: role '' defaults 27885 1726882557.84398: variable 'network_connections' from source: include params 27885 1726882557.84402: variable 'interface0' from source: play vars 27885 1726882557.84404: variable 'interface0' from source: play vars 27885 1726882557.84406: variable 'interface1' from source: play vars 27885 1726882557.84408: variable 'interface1' from source: play vars 27885 1726882557.84409: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27885 1726882557.84411: when evaluation is False, skipping this task 27885 1726882557.84413: _execute() done 27885 1726882557.84414: dumping result to json 27885 1726882557.84417: done dumping result, returning 27885 1726882557.84419: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-3fa5-01be-000000000646] 27885 1726882557.84431: sending task result for task 12673a56-9f93-3fa5-01be-000000000646 27885 1726882557.84709: done sending task result for task 12673a56-9f93-3fa5-01be-000000000646 27885 1726882557.84712: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27885 1726882557.84764: no more pending results, returning what we have 27885 1726882557.84767: results queue empty 27885 1726882557.84768: checking for any_errors_fatal 27885 1726882557.84776: done checking for any_errors_fatal 27885 1726882557.84777: checking for max_fail_percentage 27885 1726882557.84779: done checking for max_fail_percentage 27885 1726882557.84780: checking to see if all hosts have failed and the running result is not ok 27885 1726882557.84780: done checking to see if all hosts have failed 27885 1726882557.84781: getting the remaining hosts for this loop 
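Every skipped task in this stretch follows the same visible pattern: a when-expression is evaluated, "when evaluation is False, skipping this task" is logged, and a result carrying `false_condition` and `skip_reason` is sent back and printed as `skipping: [managed_node2] => {...}`. The fragment below is only a rough model of that control flow as it appears in the log, not Ansible's actual TaskExecutor code; `evaluate_conditional` and `execute_module` are hypothetical stand-ins.

```python
from jinja2 import Environment


def evaluate_conditional(expression: str, task_vars: dict) -> bool:
    # Hypothetical stand-in: render the when-expression against the task vars.
    return bool(Environment().compile_expression(expression)(**task_vars))


def execute_module(task_vars: dict) -> dict:
    # Hypothetical stand-in for the module run that never happens when skipped.
    return {"changed": False}


def run_task(when_expression: str, task_vars: dict) -> dict:
    # Mirrors the logged sequence: evaluate the conditional, and on False emit
    # the same keys the callback prints under "skipping: [...] => {...}".
    if when_expression and not evaluate_conditional(when_expression, task_vars):
        return {
            "changed": False,
            "false_condition": when_expression,
            "skip_reason": "Conditional result was False",
        }
    return execute_module(task_vars)


print(run_task(
    "__network_wireless_connections_defined or __network_team_connections_defined",
    {"__network_wireless_connections_defined": False,
     "__network_team_connections_defined": False},
))
```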
27885 1726882557.84783: done getting the remaining hosts for this loop 27885 1726882557.84787: getting the next task for host managed_node2 27885 1726882557.84799: done getting next task for host managed_node2 27885 1726882557.84804: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 27885 1726882557.84808: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882557.84834: getting variables 27885 1726882557.84836: in VariableManager get_vars() 27885 1726882557.84878: Calling all_inventory to load vars for managed_node2 27885 1726882557.84881: Calling groups_inventory to load vars for managed_node2 27885 1726882557.84884: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882557.84901: Calling all_plugins_play to load vars for managed_node2 27885 1726882557.84905: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882557.84909: Calling groups_plugins_play to load vars for managed_node2 27885 1726882557.86262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882557.91570: done with get_vars() 27885 1726882557.91607: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 27885 1726882557.91669: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:35:57 -0400 (0:00:00.117) 0:00:30.559 ****** 27885 1726882557.91702: entering _queue_task() for managed_node2/yum 27885 1726882557.92057: worker is 1 (out of 1 available) 27885 1726882557.92070: exiting _queue_task() for managed_node2/yum 27885 1726882557.92083: done queuing things up, now waiting for results queue to drain 27885 1726882557.92084: waiting for pending results... 
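The numbers printed after each TASK banner, here "(0:00:00.117) 0:00:30.559", look like output from a profile_tasks-style timing callback (which callback is enabled is not shown in this excerpt): the parenthesized value is the duration of the task that just finished, and the second value is the cumulative elapsed time. The check below uses the pairs copied from the banners in this part of the log and confirms that reading to within display rounding.

```python
# (previous task duration, cumulative elapsed) pairs copied from the TASK
# banners in this part of the log.
banners = [
    (0.062, 30.441),
    (0.117, 30.559),
    (0.069, 30.629),
    (0.082, 30.711),
    (0.116, 30.828),
    (0.031, 30.860),
]

# Each cumulative value should equal the previous cumulative plus the printed
# duration, give or take a couple of milliseconds of display rounding.
for (_, prev_total), (duration, total) in zip(banners, banners[1:]):
    assert abs(prev_total + duration - total) <= 0.002, (prev_total, duration, total)
print("cumulative elapsed = previous cumulative + task duration (to display rounding)")
```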
27885 1726882557.92514: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 27885 1726882557.92526: in run() - task 12673a56-9f93-3fa5-01be-000000000647 27885 1726882557.92547: variable 'ansible_search_path' from source: unknown 27885 1726882557.92555: variable 'ansible_search_path' from source: unknown 27885 1726882557.92599: calling self._execute() 27885 1726882557.92710: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882557.92723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882557.92738: variable 'omit' from source: magic vars 27885 1726882557.93136: variable 'ansible_distribution_major_version' from source: facts 27885 1726882557.93153: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882557.93355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882557.95641: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882557.95792: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882557.95797: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882557.95819: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882557.95855: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882557.95947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882557.95982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882557.96027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882557.96074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882557.96099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882557.96224: variable 'ansible_distribution_major_version' from source: facts 27885 1726882557.96229: Evaluated conditional (ansible_distribution_major_version | int < 8): False 27885 1726882557.96237: when evaluation is False, skipping this task 27885 1726882557.96244: _execute() done 27885 1726882557.96398: dumping result to json 27885 1726882557.96402: done dumping result, returning 27885 1726882557.96405: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-3fa5-01be-000000000647] 27885 
1726882557.96407: sending task result for task 12673a56-9f93-3fa5-01be-000000000647 27885 1726882557.96480: done sending task result for task 12673a56-9f93-3fa5-01be-000000000647 27885 1726882557.96483: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 27885 1726882557.96544: no more pending results, returning what we have 27885 1726882557.96547: results queue empty 27885 1726882557.96548: checking for any_errors_fatal 27885 1726882557.96561: done checking for any_errors_fatal 27885 1726882557.96561: checking for max_fail_percentage 27885 1726882557.96563: done checking for max_fail_percentage 27885 1726882557.96564: checking to see if all hosts have failed and the running result is not ok 27885 1726882557.96565: done checking to see if all hosts have failed 27885 1726882557.96565: getting the remaining hosts for this loop 27885 1726882557.96567: done getting the remaining hosts for this loop 27885 1726882557.96571: getting the next task for host managed_node2 27885 1726882557.96579: done getting next task for host managed_node2 27885 1726882557.96583: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 27885 1726882557.96588: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882557.96615: getting variables 27885 1726882557.96617: in VariableManager get_vars() 27885 1726882557.96665: Calling all_inventory to load vars for managed_node2 27885 1726882557.96668: Calling groups_inventory to load vars for managed_node2 27885 1726882557.96671: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882557.96682: Calling all_plugins_play to load vars for managed_node2 27885 1726882557.96692: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882557.96697: Calling groups_plugins_play to load vars for managed_node2 27885 1726882557.97580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882557.98579: done with get_vars() 27885 1726882557.98606: done getting variables 27885 1726882557.98663: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:35:57 -0400 (0:00:00.069) 0:00:30.629 ****** 27885 1726882557.98705: entering _queue_task() for managed_node2/fail 27885 1726882557.99126: worker is 1 (out of 1 available) 27885 1726882557.99140: exiting _queue_task() for managed_node2/fail 27885 1726882557.99152: done queuing things up, now waiting for results queue to drain 27885 1726882557.99154: waiting for pending results... 
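Two different version gates show up in the tasks above: `ansible_distribution_major_version != '6'` is a plain string equality check, while the skipped YUM path uses `ansible_distribution_major_version | int < 8`, which casts the fact to an integer before comparing. On this host the first evaluated True and the second False, so the major version is 8 or later. The snippet below replays both gates with jinja2; the value "10" is a hypothetical stand-in, since the actual fact value is not printed in this excerpt.

```python
from jinja2 import Environment

env = Environment()
# Hypothetical stand-in value; the real fact is not printed in this excerpt,
# only that the two gates below evaluate to True and False respectively.
facts = {"ansible_distribution_major_version": "10"}

gate_not_el6 = env.compile_expression("ansible_distribution_major_version != '6'")
gate_yum_path = env.compile_expression("ansible_distribution_major_version | int < 8")

print(gate_not_el6(**facts))   # True  -> the task body is allowed to run
print(gate_yum_path(**facts))  # False -> "Conditional result was False", YUM check skipped
```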
27885 1726882557.99380: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 27885 1726882557.99483: in run() - task 12673a56-9f93-3fa5-01be-000000000648 27885 1726882557.99497: variable 'ansible_search_path' from source: unknown 27885 1726882557.99500: variable 'ansible_search_path' from source: unknown 27885 1726882557.99528: calling self._execute() 27885 1726882557.99604: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882557.99611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882557.99621: variable 'omit' from source: magic vars 27885 1726882557.99897: variable 'ansible_distribution_major_version' from source: facts 27885 1726882557.99906: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882557.99988: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882558.00123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882558.01798: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882558.02160: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882558.02210: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882558.02247: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882558.02284: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882558.02365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882558.02410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882558.02441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.02487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882558.02516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882558.02566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882558.02700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882558.02705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.02707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882558.02711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882558.02769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882558.02807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882558.02847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.02871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882558.02882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882558.03004: variable 'network_connections' from source: include params 27885 1726882558.03015: variable 'interface0' from source: play vars 27885 1726882558.03068: variable 'interface0' from source: play vars 27885 1726882558.03078: variable 'interface1' from source: play vars 27885 1726882558.03123: variable 'interface1' from source: play vars 27885 1726882558.03172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882558.03280: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882558.03310: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882558.03332: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882558.03362: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882558.03394: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882558.03415: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882558.03432: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.03449: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 
1726882558.03491: variable '__network_team_connections_defined' from source: role '' defaults 27885 1726882558.03631: variable 'network_connections' from source: include params 27885 1726882558.03634: variable 'interface0' from source: play vars 27885 1726882558.03676: variable 'interface0' from source: play vars 27885 1726882558.03682: variable 'interface1' from source: play vars 27885 1726882558.03729: variable 'interface1' from source: play vars 27885 1726882558.03746: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27885 1726882558.03750: when evaluation is False, skipping this task 27885 1726882558.03753: _execute() done 27885 1726882558.03755: dumping result to json 27885 1726882558.03758: done dumping result, returning 27885 1726882558.03765: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-3fa5-01be-000000000648] 27885 1726882558.03775: sending task result for task 12673a56-9f93-3fa5-01be-000000000648 27885 1726882558.03867: done sending task result for task 12673a56-9f93-3fa5-01be-000000000648 27885 1726882558.03870: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27885 1726882558.03921: no more pending results, returning what we have 27885 1726882558.03925: results queue empty 27885 1726882558.03926: checking for any_errors_fatal 27885 1726882558.03930: done checking for any_errors_fatal 27885 1726882558.03931: checking for max_fail_percentage 27885 1726882558.03932: done checking for max_fail_percentage 27885 1726882558.03933: checking to see if all hosts have failed and the running result is not ok 27885 1726882558.03933: done checking to see if all hosts have failed 27885 1726882558.03934: getting the remaining hosts for this loop 27885 1726882558.03936: done getting the remaining hosts for this loop 27885 1726882558.03939: getting the next task for host managed_node2 27885 1726882558.03947: done getting next task for host managed_node2 27885 1726882558.03950: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 27885 1726882558.03954: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882558.03972: getting variables 27885 1726882558.03974: in VariableManager get_vars() 27885 1726882558.04019: Calling all_inventory to load vars for managed_node2 27885 1726882558.04022: Calling groups_inventory to load vars for managed_node2 27885 1726882558.04025: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882558.04034: Calling all_plugins_play to load vars for managed_node2 27885 1726882558.04036: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882558.04039: Calling groups_plugins_play to load vars for managed_node2 27885 1726882558.05418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882558.06829: done with get_vars() 27885 1726882558.06850: done getting variables 27885 1726882558.06917: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:35:58 -0400 (0:00:00.082) 0:00:30.711 ****** 27885 1726882558.06952: entering _queue_task() for managed_node2/package 27885 1726882558.07468: worker is 1 (out of 1 available) 27885 1726882558.07481: exiting _queue_task() for managed_node2/package 27885 1726882558.07699: done queuing things up, now waiting for results queue to drain 27885 1726882558.07701: waiting for pending results... 27885 1726882558.07911: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 27885 1726882558.07943: in run() - task 12673a56-9f93-3fa5-01be-000000000649 27885 1726882558.07965: variable 'ansible_search_path' from source: unknown 27885 1726882558.07975: variable 'ansible_search_path' from source: unknown 27885 1726882558.08023: calling self._execute() 27885 1726882558.08145: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882558.08160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882558.08253: variable 'omit' from source: magic vars 27885 1726882558.08707: variable 'ansible_distribution_major_version' from source: facts 27885 1726882558.08725: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882558.08948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882558.09514: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882558.09610: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882558.09613: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882558.09659: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882558.09968: variable 'network_packages' from source: role '' defaults 27885 1726882558.10200: variable '__network_provider_setup' from source: role '' defaults 27885 1726882558.10229: variable '__network_service_name_default_nm' from source: role '' defaults 27885 1726882558.10312: variable 
'__network_service_name_default_nm' from source: role '' defaults 27885 1726882558.10492: variable '__network_packages_default_nm' from source: role '' defaults 27885 1726882558.10561: variable '__network_packages_default_nm' from source: role '' defaults 27885 1726882558.10901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882558.12475: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882558.12520: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882558.12549: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882558.12572: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882558.12591: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882558.12662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882558.12683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882558.12705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.12730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882558.12741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882558.12774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882558.12791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882558.12811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.12836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882558.12846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882558.12986: variable '__network_packages_default_gobject_packages' from source: role '' defaults 27885 1726882558.13058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882558.13074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882558.13098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.13122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882558.13133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882558.13197: variable 'ansible_python' from source: facts 27885 1726882558.13214: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 27885 1726882558.13268: variable '__network_wpa_supplicant_required' from source: role '' defaults 27885 1726882558.13326: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27885 1726882558.13411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882558.13429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882558.13445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.13469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882558.13478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882558.13519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882558.13535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882558.13552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.13575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882558.13586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882558.13683: variable 'network_connections' from source: include params 27885 1726882558.13689: variable 'interface0' from source: play vars 27885 1726882558.13763: variable 'interface0' from source: play vars 27885 1726882558.13773: variable 'interface1' from source: play vars 27885 1726882558.13846: variable 'interface1' from source: play vars 27885 1726882558.13895: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882558.13916: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882558.13936: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.13959: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882558.13999: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882558.14175: variable 'network_connections' from source: include params 27885 1726882558.14178: variable 'interface0' from source: play vars 27885 1726882558.14251: variable 'interface0' from source: play vars 27885 1726882558.14259: variable 'interface1' from source: play vars 27885 1726882558.14333: variable 'interface1' from source: play vars 27885 1726882558.14356: variable '__network_packages_default_wireless' from source: role '' defaults 27885 1726882558.14416: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882558.14610: variable 'network_connections' from source: include params 27885 1726882558.14614: variable 'interface0' from source: play vars 27885 1726882558.14658: variable 'interface0' from source: play vars 27885 1726882558.14664: variable 'interface1' from source: play vars 27885 1726882558.14713: variable 'interface1' from source: play vars 27885 1726882558.14731: variable '__network_packages_default_team' from source: role '' defaults 27885 1726882558.14783: variable '__network_team_connections_defined' from source: role '' defaults 27885 1726882558.14980: variable 'network_connections' from source: include params 27885 1726882558.14984: variable 'interface0' from source: play vars 27885 1726882558.15034: variable 'interface0' from source: play vars 27885 1726882558.15040: variable 'interface1' from source: play vars 27885 1726882558.15085: variable 'interface1' from source: play vars 27885 1726882558.15131: variable '__network_service_name_default_initscripts' from source: role '' defaults 27885 1726882558.15176: variable '__network_service_name_default_initscripts' from source: role '' defaults 27885 1726882558.15182: variable '__network_packages_default_initscripts' from source: role '' defaults 27885 1726882558.15227: variable '__network_packages_default_initscripts' from source: role '' defaults 27885 1726882558.15364: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 27885 1726882558.15663: variable 'network_connections' from source: include params 27885 
1726882558.15667: variable 'interface0' from source: play vars 27885 1726882558.15716: variable 'interface0' from source: play vars 27885 1726882558.15723: variable 'interface1' from source: play vars 27885 1726882558.15763: variable 'interface1' from source: play vars 27885 1726882558.15769: variable 'ansible_distribution' from source: facts 27885 1726882558.15772: variable '__network_rh_distros' from source: role '' defaults 27885 1726882558.15779: variable 'ansible_distribution_major_version' from source: facts 27885 1726882558.15794: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 27885 1726882558.15901: variable 'ansible_distribution' from source: facts 27885 1726882558.15904: variable '__network_rh_distros' from source: role '' defaults 27885 1726882558.15907: variable 'ansible_distribution_major_version' from source: facts 27885 1726882558.15922: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 27885 1726882558.16042: variable 'ansible_distribution' from source: facts 27885 1726882558.16045: variable '__network_rh_distros' from source: role '' defaults 27885 1726882558.16050: variable 'ansible_distribution_major_version' from source: facts 27885 1726882558.16075: variable 'network_provider' from source: set_fact 27885 1726882558.16085: variable 'ansible_facts' from source: unknown 27885 1726882558.16523: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 27885 1726882558.16527: when evaluation is False, skipping this task 27885 1726882558.16529: _execute() done 27885 1726882558.16532: dumping result to json 27885 1726882558.16534: done dumping result, returning 27885 1726882558.16542: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-3fa5-01be-000000000649] 27885 1726882558.16546: sending task result for task 12673a56-9f93-3fa5-01be-000000000649 27885 1726882558.16637: done sending task result for task 12673a56-9f93-3fa5-01be-000000000649 27885 1726882558.16641: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 27885 1726882558.16709: no more pending results, returning what we have 27885 1726882558.16713: results queue empty 27885 1726882558.16714: checking for any_errors_fatal 27885 1726882558.16721: done checking for any_errors_fatal 27885 1726882558.16722: checking for max_fail_percentage 27885 1726882558.16723: done checking for max_fail_percentage 27885 1726882558.16724: checking to see if all hosts have failed and the running result is not ok 27885 1726882558.16725: done checking to see if all hosts have failed 27885 1726882558.16725: getting the remaining hosts for this loop 27885 1726882558.16727: done getting the remaining hosts for this loop 27885 1726882558.16731: getting the next task for host managed_node2 27885 1726882558.16738: done getting next task for host managed_node2 27885 1726882558.16741: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 27885 1726882558.16746: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882558.16771: getting variables 27885 1726882558.16772: in VariableManager get_vars() 27885 1726882558.16822: Calling all_inventory to load vars for managed_node2 27885 1726882558.16825: Calling groups_inventory to load vars for managed_node2 27885 1726882558.16827: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882558.16835: Calling all_plugins_play to load vars for managed_node2 27885 1726882558.16838: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882558.16840: Calling groups_plugins_play to load vars for managed_node2 27885 1726882558.17638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882558.18528: done with get_vars() 27885 1726882558.18544: done getting variables 27885 1726882558.18588: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:35:58 -0400 (0:00:00.116) 0:00:30.828 ****** 27885 1726882558.18614: entering _queue_task() for managed_node2/package 27885 1726882558.18846: worker is 1 (out of 1 available) 27885 1726882558.18860: exiting _queue_task() for managed_node2/package 27885 1726882558.18872: done queuing things up, now waiting for results queue to drain 27885 1726882558.18874: waiting for pending results... 
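The "Install packages" task above was skipped because `not network_packages is subset(ansible_facts.packages.keys())` evaluated False, i.e. everything the role wants is already present in the gathered package facts. `subset` is an Ansible-provided test rather than core Jinja2; in plain Python the same gate is a set-inclusion check, sketched below with hypothetical package names.

```python
# Plain-Python version of the skipped install gate:
#   not network_packages is subset(ansible_facts.packages.keys())
# "subset" is an Ansible test, not core Jinja2; sets express the same check.
# The package names below are hypothetical, not read from the log.
network_packages = ["NetworkManager"]
ansible_facts_packages = {
    "NetworkManager": [{"version": "1.x"}],
    "openssh-server": [{"version": "9.x"}],
}

need_install = not set(network_packages) <= set(ansible_facts_packages.keys())
print(need_install)  # False -> "Conditional result was False", task skipped
```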
27885 1726882558.19052: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 27885 1726882558.19150: in run() - task 12673a56-9f93-3fa5-01be-00000000064a 27885 1726882558.19161: variable 'ansible_search_path' from source: unknown 27885 1726882558.19163: variable 'ansible_search_path' from source: unknown 27885 1726882558.19192: calling self._execute() 27885 1726882558.19272: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882558.19275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882558.19284: variable 'omit' from source: magic vars 27885 1726882558.19559: variable 'ansible_distribution_major_version' from source: facts 27885 1726882558.19569: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882558.19654: variable 'network_state' from source: role '' defaults 27885 1726882558.19663: Evaluated conditional (network_state != {}): False 27885 1726882558.19667: when evaluation is False, skipping this task 27885 1726882558.19669: _execute() done 27885 1726882558.19672: dumping result to json 27885 1726882558.19674: done dumping result, returning 27885 1726882558.19682: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-3fa5-01be-00000000064a] 27885 1726882558.19688: sending task result for task 12673a56-9f93-3fa5-01be-00000000064a 27885 1726882558.19780: done sending task result for task 12673a56-9f93-3fa5-01be-00000000064a 27885 1726882558.19782: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27885 1726882558.19832: no more pending results, returning what we have 27885 1726882558.19836: results queue empty 27885 1726882558.19836: checking for any_errors_fatal 27885 1726882558.19843: done checking for any_errors_fatal 27885 1726882558.19843: checking for max_fail_percentage 27885 1726882558.19845: done checking for max_fail_percentage 27885 1726882558.19846: checking to see if all hosts have failed and the running result is not ok 27885 1726882558.19846: done checking to see if all hosts have failed 27885 1726882558.19847: getting the remaining hosts for this loop 27885 1726882558.19849: done getting the remaining hosts for this loop 27885 1726882558.19852: getting the next task for host managed_node2 27885 1726882558.19860: done getting next task for host managed_node2 27885 1726882558.19863: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 27885 1726882558.19868: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882558.19887: getting variables 27885 1726882558.19889: in VariableManager get_vars() 27885 1726882558.19926: Calling all_inventory to load vars for managed_node2 27885 1726882558.19929: Calling groups_inventory to load vars for managed_node2 27885 1726882558.19931: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882558.19939: Calling all_plugins_play to load vars for managed_node2 27885 1726882558.19941: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882558.19944: Calling groups_plugins_play to load vars for managed_node2 27885 1726882558.20836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882558.21703: done with get_vars() 27885 1726882558.21719: done getting variables 27885 1726882558.21763: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:35:58 -0400 (0:00:00.031) 0:00:30.860 ****** 27885 1726882558.21787: entering _queue_task() for managed_node2/package 27885 1726882558.22036: worker is 1 (out of 1 available) 27885 1726882558.22051: exiting _queue_task() for managed_node2/package 27885 1726882558.22063: done queuing things up, now waiting for results queue to drain 27885 1726882558.22064: waiting for pending results... 
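The next task queued (main.yml:96) follows the same pattern for python3-libnmstate and, as the trace below shows, is skipped for the same reason. Both install tasks only fire when the caller drives the role through a non-empty network_state instead of network_connections. A hedged illustration of such an invocation, assuming network_state follows the nmstate-style schema the role name suggests (the host, interface name, and settings here are made up for the example):

    - hosts: managed_node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_state:
              interfaces:
                - name: eth1        # hypothetical interface, not present in this run
                  type: ethernet
                  state: up
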
27885 1726882558.22242: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 27885 1726882558.22341: in run() - task 12673a56-9f93-3fa5-01be-00000000064b 27885 1726882558.22351: variable 'ansible_search_path' from source: unknown 27885 1726882558.22355: variable 'ansible_search_path' from source: unknown 27885 1726882558.22384: calling self._execute() 27885 1726882558.22464: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882558.22467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882558.22476: variable 'omit' from source: magic vars 27885 1726882558.22755: variable 'ansible_distribution_major_version' from source: facts 27885 1726882558.22765: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882558.22849: variable 'network_state' from source: role '' defaults 27885 1726882558.22858: Evaluated conditional (network_state != {}): False 27885 1726882558.22861: when evaluation is False, skipping this task 27885 1726882558.22864: _execute() done 27885 1726882558.22866: dumping result to json 27885 1726882558.22871: done dumping result, returning 27885 1726882558.22878: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-3fa5-01be-00000000064b] 27885 1726882558.22884: sending task result for task 12673a56-9f93-3fa5-01be-00000000064b 27885 1726882558.22973: done sending task result for task 12673a56-9f93-3fa5-01be-00000000064b 27885 1726882558.22975: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27885 1726882558.23020: no more pending results, returning what we have 27885 1726882558.23024: results queue empty 27885 1726882558.23025: checking for any_errors_fatal 27885 1726882558.23031: done checking for any_errors_fatal 27885 1726882558.23032: checking for max_fail_percentage 27885 1726882558.23034: done checking for max_fail_percentage 27885 1726882558.23034: checking to see if all hosts have failed and the running result is not ok 27885 1726882558.23035: done checking to see if all hosts have failed 27885 1726882558.23036: getting the remaining hosts for this loop 27885 1726882558.23037: done getting the remaining hosts for this loop 27885 1726882558.23040: getting the next task for host managed_node2 27885 1726882558.23048: done getting next task for host managed_node2 27885 1726882558.23051: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 27885 1726882558.23056: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882558.23075: getting variables 27885 1726882558.23077: in VariableManager get_vars() 27885 1726882558.23114: Calling all_inventory to load vars for managed_node2 27885 1726882558.23117: Calling groups_inventory to load vars for managed_node2 27885 1726882558.23118: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882558.23127: Calling all_plugins_play to load vars for managed_node2 27885 1726882558.23129: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882558.23131: Calling groups_plugins_play to load vars for managed_node2 27885 1726882558.23896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882558.24878: done with get_vars() 27885 1726882558.24897: done getting variables 27885 1726882558.24941: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:35:58 -0400 (0:00:00.031) 0:00:30.892 ****** 27885 1726882558.24963: entering _queue_task() for managed_node2/service 27885 1726882558.25199: worker is 1 (out of 1 available) 27885 1726882558.25213: exiting _queue_task() for managed_node2/service 27885 1726882558.25226: done queuing things up, now waiting for results queue to drain 27885 1726882558.25227: waiting for pending results... 
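The task queued here (main.yml:109) uses the service action and, as the evaluation below shows, is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined is true for this run's network_connections (interface0 and interface1 are ordinary connections). Only the module and the conditional are confirmed by the log; the task body below is a rough sketch of how such a restart could be gated:

    - name: Restart NetworkManager due to wireless or team interfaces
      service:
        name: NetworkManager     # unit name inferred from the task name, not from the log
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined
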
27885 1726882558.25404: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 27885 1726882558.25498: in run() - task 12673a56-9f93-3fa5-01be-00000000064c 27885 1726882558.25511: variable 'ansible_search_path' from source: unknown 27885 1726882558.25515: variable 'ansible_search_path' from source: unknown 27885 1726882558.25543: calling self._execute() 27885 1726882558.25619: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882558.25622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882558.25631: variable 'omit' from source: magic vars 27885 1726882558.25898: variable 'ansible_distribution_major_version' from source: facts 27885 1726882558.25907: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882558.25986: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882558.26127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882558.27585: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882558.27643: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882558.27671: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882558.27697: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882558.27717: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882558.27777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882558.27800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882558.27818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.27845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882558.27856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882558.27892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882558.27908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882558.27924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 27885 1726882558.27949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882558.27963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882558.27997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882558.28015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882558.28032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.28060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882558.28070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882558.28185: variable 'network_connections' from source: include params 27885 1726882558.28200: variable 'interface0' from source: play vars 27885 1726882558.28251: variable 'interface0' from source: play vars 27885 1726882558.28260: variable 'interface1' from source: play vars 27885 1726882558.28311: variable 'interface1' from source: play vars 27885 1726882558.28356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882558.28474: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882558.28508: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882558.28532: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882558.28553: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882558.28581: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882558.28602: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882558.28620: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.28638: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882558.28675: variable '__network_team_connections_defined' from source: role '' 
defaults 27885 1726882558.28830: variable 'network_connections' from source: include params 27885 1726882558.28834: variable 'interface0' from source: play vars 27885 1726882558.28877: variable 'interface0' from source: play vars 27885 1726882558.28883: variable 'interface1' from source: play vars 27885 1726882558.28929: variable 'interface1' from source: play vars 27885 1726882558.28948: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27885 1726882558.28951: when evaluation is False, skipping this task 27885 1726882558.28954: _execute() done 27885 1726882558.28958: dumping result to json 27885 1726882558.28960: done dumping result, returning 27885 1726882558.28968: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-3fa5-01be-00000000064c] 27885 1726882558.28978: sending task result for task 12673a56-9f93-3fa5-01be-00000000064c 27885 1726882558.29062: done sending task result for task 12673a56-9f93-3fa5-01be-00000000064c 27885 1726882558.29064: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27885 1726882558.29112: no more pending results, returning what we have 27885 1726882558.29115: results queue empty 27885 1726882558.29116: checking for any_errors_fatal 27885 1726882558.29122: done checking for any_errors_fatal 27885 1726882558.29123: checking for max_fail_percentage 27885 1726882558.29124: done checking for max_fail_percentage 27885 1726882558.29125: checking to see if all hosts have failed and the running result is not ok 27885 1726882558.29125: done checking to see if all hosts have failed 27885 1726882558.29126: getting the remaining hosts for this loop 27885 1726882558.29128: done getting the remaining hosts for this loop 27885 1726882558.29131: getting the next task for host managed_node2 27885 1726882558.29138: done getting next task for host managed_node2 27885 1726882558.29141: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 27885 1726882558.29145: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882558.29165: getting variables 27885 1726882558.29166: in VariableManager get_vars() 27885 1726882558.29214: Calling all_inventory to load vars for managed_node2 27885 1726882558.29217: Calling groups_inventory to load vars for managed_node2 27885 1726882558.29220: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882558.29228: Calling all_plugins_play to load vars for managed_node2 27885 1726882558.29231: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882558.29233: Calling groups_plugins_play to load vars for managed_node2 27885 1726882558.30026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882558.30898: done with get_vars() 27885 1726882558.30914: done getting variables 27885 1726882558.30953: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:35:58 -0400 (0:00:00.060) 0:00:30.952 ****** 27885 1726882558.30974: entering _queue_task() for managed_node2/service 27885 1726882558.31200: worker is 1 (out of 1 available) 27885 1726882558.31212: exiting _queue_task() for managed_node2/service 27885 1726882558.31225: done queuing things up, now waiting for results queue to drain 27885 1726882558.31226: waiting for pending results... 27885 1726882558.31396: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 27885 1726882558.31488: in run() - task 12673a56-9f93-3fa5-01be-00000000064d 27885 1726882558.31503: variable 'ansible_search_path' from source: unknown 27885 1726882558.31508: variable 'ansible_search_path' from source: unknown 27885 1726882558.31534: calling self._execute() 27885 1726882558.31610: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882558.31616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882558.31624: variable 'omit' from source: magic vars 27885 1726882558.31890: variable 'ansible_distribution_major_version' from source: facts 27885 1726882558.31901: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882558.32004: variable 'network_provider' from source: set_fact 27885 1726882558.32008: variable 'network_state' from source: role '' defaults 27885 1726882558.32019: Evaluated conditional (network_provider == "nm" or network_state != {}): True 27885 1726882558.32024: variable 'omit' from source: magic vars 27885 1726882558.32059: variable 'omit' from source: magic vars 27885 1726882558.32078: variable 'network_service_name' from source: role '' defaults 27885 1726882558.32135: variable 'network_service_name' from source: role '' defaults 27885 1726882558.32206: variable '__network_provider_setup' from source: role '' defaults 27885 1726882558.32211: variable '__network_service_name_default_nm' from source: role '' defaults 27885 1726882558.32257: variable '__network_service_name_default_nm' from source: role '' defaults 27885 1726882558.32264: variable '__network_packages_default_nm' from source: role '' defaults 
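Unlike the three skipped tasks above, this task (main.yml:122) passes its conditional (network_provider == "nm" or network_state != {}) and actually runs the service action: the remainder of this block resolves the role defaults, opens the SSH connection, creates a remote temp directory, transfers and executes the AnsiballZ_systemd.py wrapper with /usr/bin/python3.12, and the returned JSON reports the systemd module invoked with name=NetworkManager, state=started, enabled=true. A minimal sketch of an equivalent task, assuming the role resolves the unit through network_service_name as the variable trace suggests (the exact YAML in the role is not shown in this log):

    - name: Enable and start NetworkManager
      service:
        name: "{{ network_service_name }}"   # reported as NetworkManager in this run's module_args
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}
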
27885 1726882558.32312: variable '__network_packages_default_nm' from source: role '' defaults 27885 1726882558.32460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882558.33862: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882558.33915: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882558.33942: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882558.33968: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882558.33988: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882558.34046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882558.34073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882558.34087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.34117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882558.34128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882558.34158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882558.34178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882558.34199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.34223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882558.34234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882558.34376: variable '__network_packages_default_gobject_packages' from source: role '' defaults 27885 1726882558.34451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882558.34467: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882558.34483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.34516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882558.34526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882558.34583: variable 'ansible_python' from source: facts 27885 1726882558.34604: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 27885 1726882558.34658: variable '__network_wpa_supplicant_required' from source: role '' defaults 27885 1726882558.34712: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27885 1726882558.34792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882558.34813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882558.34833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.34857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882558.34867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882558.34903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882558.34922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882558.34942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.34965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882558.34975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882558.35067: variable 'network_connections' from 
source: include params 27885 1726882558.35073: variable 'interface0' from source: play vars 27885 1726882558.35127: variable 'interface0' from source: play vars 27885 1726882558.35138: variable 'interface1' from source: play vars 27885 1726882558.35190: variable 'interface1' from source: play vars 27885 1726882558.35262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882558.35625: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882558.35659: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882558.35688: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882558.35724: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882558.35765: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882558.35786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882558.35818: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.35839: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882558.35875: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882558.36054: variable 'network_connections' from source: include params 27885 1726882558.36060: variable 'interface0' from source: play vars 27885 1726882558.36115: variable 'interface0' from source: play vars 27885 1726882558.36124: variable 'interface1' from source: play vars 27885 1726882558.36177: variable 'interface1' from source: play vars 27885 1726882558.36204: variable '__network_packages_default_wireless' from source: role '' defaults 27885 1726882558.36258: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882558.36438: variable 'network_connections' from source: include params 27885 1726882558.36441: variable 'interface0' from source: play vars 27885 1726882558.36492: variable 'interface0' from source: play vars 27885 1726882558.36502: variable 'interface1' from source: play vars 27885 1726882558.36549: variable 'interface1' from source: play vars 27885 1726882558.36566: variable '__network_packages_default_team' from source: role '' defaults 27885 1726882558.36624: variable '__network_team_connections_defined' from source: role '' defaults 27885 1726882558.36803: variable 'network_connections' from source: include params 27885 1726882558.36807: variable 'interface0' from source: play vars 27885 1726882558.36856: variable 'interface0' from source: play vars 27885 1726882558.36862: variable 'interface1' from source: play vars 27885 1726882558.36915: variable 'interface1' from source: play vars 27885 1726882558.36949: variable '__network_service_name_default_initscripts' from source: role '' defaults 27885 1726882558.36989: variable 
'__network_service_name_default_initscripts' from source: role '' defaults 27885 1726882558.36999: variable '__network_packages_default_initscripts' from source: role '' defaults 27885 1726882558.37043: variable '__network_packages_default_initscripts' from source: role '' defaults 27885 1726882558.37176: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 27885 1726882558.37481: variable 'network_connections' from source: include params 27885 1726882558.37484: variable 'interface0' from source: play vars 27885 1726882558.37530: variable 'interface0' from source: play vars 27885 1726882558.37536: variable 'interface1' from source: play vars 27885 1726882558.37580: variable 'interface1' from source: play vars 27885 1726882558.37586: variable 'ansible_distribution' from source: facts 27885 1726882558.37588: variable '__network_rh_distros' from source: role '' defaults 27885 1726882558.37599: variable 'ansible_distribution_major_version' from source: facts 27885 1726882558.37610: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 27885 1726882558.37723: variable 'ansible_distribution' from source: facts 27885 1726882558.37726: variable '__network_rh_distros' from source: role '' defaults 27885 1726882558.37731: variable 'ansible_distribution_major_version' from source: facts 27885 1726882558.37741: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 27885 1726882558.37851: variable 'ansible_distribution' from source: facts 27885 1726882558.37855: variable '__network_rh_distros' from source: role '' defaults 27885 1726882558.37859: variable 'ansible_distribution_major_version' from source: facts 27885 1726882558.37886: variable 'network_provider' from source: set_fact 27885 1726882558.37907: variable 'omit' from source: magic vars 27885 1726882558.37926: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882558.37945: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882558.37959: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882558.37971: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882558.37980: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882558.38007: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882558.38010: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882558.38013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882558.38078: Set connection var ansible_pipelining to False 27885 1726882558.38081: Set connection var ansible_connection to ssh 27885 1726882558.38086: Set connection var ansible_timeout to 10 27885 1726882558.38089: Set connection var ansible_shell_type to sh 27885 1726882558.38104: Set connection var ansible_shell_executable to /bin/sh 27885 1726882558.38106: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882558.38123: variable 'ansible_shell_executable' from source: unknown 27885 1726882558.38125: variable 'ansible_connection' from source: unknown 27885 1726882558.38128: variable 'ansible_module_compression' from source: unknown 27885 1726882558.38130: 
variable 'ansible_shell_type' from source: unknown 27885 1726882558.38132: variable 'ansible_shell_executable' from source: unknown 27885 1726882558.38139: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882558.38141: variable 'ansible_pipelining' from source: unknown 27885 1726882558.38143: variable 'ansible_timeout' from source: unknown 27885 1726882558.38145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882558.38214: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882558.38223: variable 'omit' from source: magic vars 27885 1726882558.38229: starting attempt loop 27885 1726882558.38231: running the handler 27885 1726882558.38284: variable 'ansible_facts' from source: unknown 27885 1726882558.38670: _low_level_execute_command(): starting 27885 1726882558.38675: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882558.39179: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882558.39185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882558.39190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882558.39194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882558.39196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882558.39244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882558.39247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882558.39249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882558.39323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882558.41023: stdout chunk (state=3): >>>/root <<< 27885 1726882558.41124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882558.41153: stderr chunk (state=3): >>><<< 27885 1726882558.41156: stdout chunk (state=3): >>><<< 27885 1726882558.41173: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882558.41183: _low_level_execute_command(): starting 27885 1726882558.41187: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882558.411725-29298-251987129896929 `" && echo ansible-tmp-1726882558.411725-29298-251987129896929="` echo /root/.ansible/tmp/ansible-tmp-1726882558.411725-29298-251987129896929 `" ) && sleep 0' 27885 1726882558.41620: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882558.41623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882558.41625: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882558.41627: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882558.41629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882558.41631: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882558.41682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882558.41689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882558.41692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882558.41745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882558.43624: stdout chunk (state=3): >>>ansible-tmp-1726882558.411725-29298-251987129896929=/root/.ansible/tmp/ansible-tmp-1726882558.411725-29298-251987129896929 <<< 27885 1726882558.43730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882558.43756: stderr chunk (state=3): >>><<< 27885 1726882558.43759: stdout chunk (state=3): >>><<< 27885 1726882558.43770: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882558.411725-29298-251987129896929=/root/.ansible/tmp/ansible-tmp-1726882558.411725-29298-251987129896929 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882558.43797: variable 'ansible_module_compression' from source: unknown 27885 1726882558.43835: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 27885 1726882558.43888: variable 'ansible_facts' from source: unknown 27885 1726882558.44027: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882558.411725-29298-251987129896929/AnsiballZ_systemd.py 27885 1726882558.44122: Sending initial data 27885 1726882558.44125: Sent initial data (155 bytes) 27885 1726882558.44559: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882558.44562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882558.44564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882558.44567: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882558.44569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882558.44628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882558.44631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882558.44688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882558.46206: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports 
extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882558.46265: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882558.46325: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpexy2a662 /root/.ansible/tmp/ansible-tmp-1726882558.411725-29298-251987129896929/AnsiballZ_systemd.py <<< 27885 1726882558.46329: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882558.411725-29298-251987129896929/AnsiballZ_systemd.py" <<< 27885 1726882558.46385: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpexy2a662" to remote "/root/.ansible/tmp/ansible-tmp-1726882558.411725-29298-251987129896929/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882558.411725-29298-251987129896929/AnsiballZ_systemd.py" <<< 27885 1726882558.47570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882558.47608: stderr chunk (state=3): >>><<< 27885 1726882558.47611: stdout chunk (state=3): >>><<< 27885 1726882558.47643: done transferring module to remote 27885 1726882558.47656: _low_level_execute_command(): starting 27885 1726882558.47659: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882558.411725-29298-251987129896929/ /root/.ansible/tmp/ansible-tmp-1726882558.411725-29298-251987129896929/AnsiballZ_systemd.py && sleep 0' 27885 1726882558.48068: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882558.48071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882558.48074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27885 1726882558.48076: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882558.48077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882558.48128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882558.48131: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882558.48203: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 27885 1726882558.49913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882558.49937: stderr chunk (state=3): >>><<< 27885 1726882558.49941: stdout chunk (state=3): >>><<< 27885 1726882558.49952: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882558.49955: _low_level_execute_command(): starting 27885 1726882558.49959: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882558.411725-29298-251987129896929/AnsiballZ_systemd.py && sleep 0' 27885 1726882558.50383: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882558.50386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882558.50395: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882558.50397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882558.50399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882558.50443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882558.50446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882558.50520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882558.79207: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", 
"RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4694016", "MemoryPeak": "7507968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3307094016", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1381666000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", 
"MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target multi-user.target", "After": "basi<<< 27885 1726882558.79242: stdout chunk (state=3): >>>c.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 27885 1726882558.80938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882558.80942: stdout chunk (state=3): >>><<< 27885 1726882558.80944: stderr chunk (state=3): >>><<< 27885 1726882558.81000: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4694016", "MemoryPeak": "7507968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3307094016", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1381666000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target multi-user.target", "After": "basic.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 27885 1726882558.81197: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882558.411725-29298-251987129896929/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882558.81295: _low_level_execute_command(): starting 27885 1726882558.81299: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882558.411725-29298-251987129896929/ > /dev/null 2>&1 && sleep 0' 27885 1726882558.81879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882558.81900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882558.81966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882558.82032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882558.82050: stderr chunk (state=3): >>>debug2: fd 
3 setting O_NONBLOCK <<< 27885 1726882558.82082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882558.82194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882558.84026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882558.84075: stderr chunk (state=3): >>><<< 27885 1726882558.84078: stdout chunk (state=3): >>><<< 27885 1726882558.84119: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882558.84122: handler run complete 27885 1726882558.84183: attempt loop complete, returning result 27885 1726882558.84195: _execute() done 27885 1726882558.84226: dumping result to json 27885 1726882558.84232: done dumping result, returning 27885 1726882558.84244: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-3fa5-01be-00000000064d] 27885 1726882558.84254: sending task result for task 12673a56-9f93-3fa5-01be-00000000064d ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27885 1726882558.84834: no more pending results, returning what we have 27885 1726882558.84838: results queue empty 27885 1726882558.84839: checking for any_errors_fatal 27885 1726882558.84845: done checking for any_errors_fatal 27885 1726882558.84846: checking for max_fail_percentage 27885 1726882558.84847: done checking for max_fail_percentage 27885 1726882558.84848: checking to see if all hosts have failed and the running result is not ok 27885 1726882558.84849: done checking to see if all hosts have failed 27885 1726882558.84849: getting the remaining hosts for this loop 27885 1726882558.84851: done getting the remaining hosts for this loop 27885 1726882558.84854: getting the next task for host managed_node2 27885 1726882558.84861: done getting next task for host managed_node2 27885 1726882558.84866: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 27885 1726882558.84870: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882558.84890: getting variables 27885 1726882558.84892: in VariableManager get_vars() 27885 1726882558.84932: Calling all_inventory to load vars for managed_node2 27885 1726882558.84935: Calling groups_inventory to load vars for managed_node2 27885 1726882558.84937: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882558.84948: Calling all_plugins_play to load vars for managed_node2 27885 1726882558.84952: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882558.84955: Calling groups_plugins_play to load vars for managed_node2 27885 1726882558.85609: done sending task result for task 12673a56-9f93-3fa5-01be-00000000064d 27885 1726882558.85613: WORKER PROCESS EXITING 27885 1726882558.86705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882558.88325: done with get_vars() 27885 1726882558.88354: done getting variables 27885 1726882558.88422: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:35:58 -0400 (0:00:00.574) 0:00:31.527 ****** 27885 1726882558.88461: entering _queue_task() for managed_node2/service 27885 1726882558.88831: worker is 1 (out of 1 available) 27885 1726882558.88844: exiting _queue_task() for managed_node2/service 27885 1726882558.88856: done queuing things up, now waiting for results queue to drain 27885 1726882558.88858: waiting for pending results... 
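The ok result above is censored because no_log was set, but the echoed invocation block shows exactly what ran: ansible.legacy.systemd with name=NetworkManager, state=started, enabled=true, scope=system. A minimal standalone equivalent of that call is sketched below for reference; the play header, task name, and the ansible.builtin.systemd_service module spelling are illustrative choices, not taken from the role source.

# Sketch only: reproduces the systemd invocation recorded above
# (name=NetworkManager, state=started, enabled=true, scope=system).
- hosts: managed_node2
  gather_facts: false
  tasks:
    - name: Enable and start NetworkManager (standalone equivalent)
      ansible.builtin.systemd_service:
        name: NetworkManager
        state: started
        enabled: true
        scope: system
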
27885 1726882558.89166: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 27885 1726882558.89337: in run() - task 12673a56-9f93-3fa5-01be-00000000064e 27885 1726882558.89358: variable 'ansible_search_path' from source: unknown 27885 1726882558.89368: variable 'ansible_search_path' from source: unknown 27885 1726882558.89415: calling self._execute() 27885 1726882558.89528: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882558.89546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882558.89559: variable 'omit' from source: magic vars 27885 1726882558.89923: variable 'ansible_distribution_major_version' from source: facts 27885 1726882558.89974: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882558.90057: variable 'network_provider' from source: set_fact 27885 1726882558.90068: Evaluated conditional (network_provider == "nm"): True 27885 1726882558.90161: variable '__network_wpa_supplicant_required' from source: role '' defaults 27885 1726882558.90261: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27885 1726882558.90452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882558.92701: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882558.92717: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882558.92758: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882558.92802: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882558.92920: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882558.92945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882558.92980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882558.93015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.93067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882558.93089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882558.93150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882558.93178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 27885 1726882558.93212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.93354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882558.93357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882558.93359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882558.93360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882558.93365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.93405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882558.93419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882558.93564: variable 'network_connections' from source: include params 27885 1726882558.93597: variable 'interface0' from source: play vars 27885 1726882558.93695: variable 'interface0' from source: play vars 27885 1726882558.93795: variable 'interface1' from source: play vars 27885 1726882558.93798: variable 'interface1' from source: play vars 27885 1726882558.93852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27885 1726882558.94024: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27885 1726882558.94062: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27885 1726882558.94099: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27885 1726882558.94139: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27885 1726882558.94184: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27885 1726882558.94215: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27885 1726882558.94345: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882558.94349: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27885 1726882558.94352: variable '__network_wireless_connections_defined' from source: role '' defaults 27885 1726882558.94619: variable 'network_connections' from source: include params 27885 1726882558.94629: variable 'interface0' from source: play vars 27885 1726882558.94704: variable 'interface0' from source: play vars 27885 1726882558.94716: variable 'interface1' from source: play vars 27885 1726882558.94780: variable 'interface1' from source: play vars 27885 1726882558.94821: Evaluated conditional (__network_wpa_supplicant_required): False 27885 1726882558.94831: when evaluation is False, skipping this task 27885 1726882558.94847: _execute() done 27885 1726882558.94854: dumping result to json 27885 1726882558.94895: done dumping result, returning 27885 1726882558.94901: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-3fa5-01be-00000000064e] 27885 1726882558.94903: sending task result for task 12673a56-9f93-3fa5-01be-00000000064e skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 27885 1726882558.95144: no more pending results, returning what we have 27885 1726882558.95147: results queue empty 27885 1726882558.95148: checking for any_errors_fatal 27885 1726882558.95176: done checking for any_errors_fatal 27885 1726882558.95177: checking for max_fail_percentage 27885 1726882558.95179: done checking for max_fail_percentage 27885 1726882558.95180: checking to see if all hosts have failed and the running result is not ok 27885 1726882558.95180: done checking to see if all hosts have failed 27885 1726882558.95181: getting the remaining hosts for this loop 27885 1726882558.95183: done getting the remaining hosts for this loop 27885 1726882558.95190: getting the next task for host managed_node2 27885 1726882558.95199: done getting next task for host managed_node2 27885 1726882558.95204: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 27885 1726882558.95209: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882558.95231: getting variables 27885 1726882558.95233: in VariableManager get_vars() 27885 1726882558.95277: Calling all_inventory to load vars for managed_node2 27885 1726882558.95281: Calling groups_inventory to load vars for managed_node2 27885 1726882558.95284: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882558.95413: Calling all_plugins_play to load vars for managed_node2 27885 1726882558.95418: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882558.95502: Calling groups_plugins_play to load vars for managed_node2 27885 1726882558.96027: done sending task result for task 12673a56-9f93-3fa5-01be-00000000064e 27885 1726882558.96031: WORKER PROCESS EXITING 27885 1726882558.96951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882558.98609: done with get_vars() 27885 1726882558.98638: done getting variables 27885 1726882558.98711: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:35:58 -0400 (0:00:00.102) 0:00:31.629 ****** 27885 1726882558.98744: entering _queue_task() for managed_node2/service 27885 1726882558.99112: worker is 1 (out of 1 available) 27885 1726882558.99124: exiting _queue_task() for managed_node2/service 27885 1726882558.99138: done queuing things up, now waiting for results queue to drain 27885 1726882558.99140: waiting for pending results... 
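The skip above is driven purely by conditionals: the task only runs when network_provider is "nm" and __network_wpa_supplicant_required is true (which itself depends on wireless or IEEE 802.1x connections being defined), and the latter evaluated to False here. A simplified sketch of that gating pattern follows; it is not the role's actual task body, and the systemd-based call and unit name are assumptions modeled on the NetworkManager task above.

# Illustrative tasks-file snippet; variable and task names mirror the log,
# the module call and unit name are assumptions for this sketch.
- name: Enable and start wpa_supplicant
  ansible.builtin.systemd_service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool
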
27885 1726882558.99430: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 27885 1726882558.99568: in run() - task 12673a56-9f93-3fa5-01be-00000000064f 27885 1726882558.99590: variable 'ansible_search_path' from source: unknown 27885 1726882558.99600: variable 'ansible_search_path' from source: unknown 27885 1726882558.99644: calling self._execute() 27885 1726882558.99752: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882558.99763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882558.99773: variable 'omit' from source: magic vars 27885 1726882559.00144: variable 'ansible_distribution_major_version' from source: facts 27885 1726882559.00169: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882559.00303: variable 'network_provider' from source: set_fact 27885 1726882559.00314: Evaluated conditional (network_provider == "initscripts"): False 27885 1726882559.00321: when evaluation is False, skipping this task 27885 1726882559.00379: _execute() done 27885 1726882559.00382: dumping result to json 27885 1726882559.00384: done dumping result, returning 27885 1726882559.00389: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-3fa5-01be-00000000064f] 27885 1726882559.00391: sending task result for task 12673a56-9f93-3fa5-01be-00000000064f skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27885 1726882559.00536: no more pending results, returning what we have 27885 1726882559.00541: results queue empty 27885 1726882559.00542: checking for any_errors_fatal 27885 1726882559.00551: done checking for any_errors_fatal 27885 1726882559.00551: checking for max_fail_percentage 27885 1726882559.00553: done checking for max_fail_percentage 27885 1726882559.00554: checking to see if all hosts have failed and the running result is not ok 27885 1726882559.00555: done checking to see if all hosts have failed 27885 1726882559.00555: getting the remaining hosts for this loop 27885 1726882559.00558: done getting the remaining hosts for this loop 27885 1726882559.00561: getting the next task for host managed_node2 27885 1726882559.00571: done getting next task for host managed_node2 27885 1726882559.00576: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 27885 1726882559.00582: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882559.00612: getting variables 27885 1726882559.00614: in VariableManager get_vars() 27885 1726882559.00657: Calling all_inventory to load vars for managed_node2 27885 1726882559.00661: Calling groups_inventory to load vars for managed_node2 27885 1726882559.00663: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882559.00676: Calling all_plugins_play to load vars for managed_node2 27885 1726882559.00680: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882559.00683: Calling groups_plugins_play to load vars for managed_node2 27885 1726882559.01580: done sending task result for task 12673a56-9f93-3fa5-01be-00000000064f 27885 1726882559.01583: WORKER PROCESS EXITING 27885 1726882559.02484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882559.04176: done with get_vars() 27885 1726882559.04203: done getting variables 27885 1726882559.04259: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:35:59 -0400 (0:00:00.055) 0:00:31.685 ****** 27885 1726882559.04302: entering _queue_task() for managed_node2/copy 27885 1726882559.04815: worker is 1 (out of 1 available) 27885 1726882559.04825: exiting _queue_task() for managed_node2/copy 27885 1726882559.04836: done queuing things up, now waiting for results queue to drain 27885 1726882559.04837: waiting for pending results... 
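Both initscripts-only tasks above ("Enable network service" and "Ensure initscripts network file dependency is present") are skipped because this run resolved network_provider to "nm"; the next task hands the network_connections list built from the interface0/interface1 play vars to the collection's network_connections module. The sketch below shows how a playbook typically feeds those variables to the role; the profile name, interface name, and settings are placeholders, since the real values are not visible in this log.

# Illustrative role invocation; connection details are placeholders.
- hosts: managed_node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm          # NetworkManager path; initscripts tasks are skipped
        network_connections:
          - name: example-profile     # placeholder
            interface_name: eth0      # placeholder
            type: ethernet
            state: up
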
27885 1726882559.04945: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 27885 1726882559.05096: in run() - task 12673a56-9f93-3fa5-01be-000000000650 27885 1726882559.05118: variable 'ansible_search_path' from source: unknown 27885 1726882559.05125: variable 'ansible_search_path' from source: unknown 27885 1726882559.05161: calling self._execute() 27885 1726882559.05267: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882559.05399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882559.05402: variable 'omit' from source: magic vars 27885 1726882559.05730: variable 'ansible_distribution_major_version' from source: facts 27885 1726882559.05748: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882559.05872: variable 'network_provider' from source: set_fact 27885 1726882559.05884: Evaluated conditional (network_provider == "initscripts"): False 27885 1726882559.05898: when evaluation is False, skipping this task 27885 1726882559.05907: _execute() done 27885 1726882559.05914: dumping result to json 27885 1726882559.05922: done dumping result, returning 27885 1726882559.05943: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-3fa5-01be-000000000650] 27885 1726882559.05954: sending task result for task 12673a56-9f93-3fa5-01be-000000000650 27885 1726882559.06182: done sending task result for task 12673a56-9f93-3fa5-01be-000000000650 27885 1726882559.06186: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 27885 1726882559.06246: no more pending results, returning what we have 27885 1726882559.06251: results queue empty 27885 1726882559.06252: checking for any_errors_fatal 27885 1726882559.06260: done checking for any_errors_fatal 27885 1726882559.06261: checking for max_fail_percentage 27885 1726882559.06263: done checking for max_fail_percentage 27885 1726882559.06263: checking to see if all hosts have failed and the running result is not ok 27885 1726882559.06264: done checking to see if all hosts have failed 27885 1726882559.06265: getting the remaining hosts for this loop 27885 1726882559.06267: done getting the remaining hosts for this loop 27885 1726882559.06271: getting the next task for host managed_node2 27885 1726882559.06280: done getting next task for host managed_node2 27885 1726882559.06284: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 27885 1726882559.06292: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882559.06321: getting variables 27885 1726882559.06323: in VariableManager get_vars() 27885 1726882559.06369: Calling all_inventory to load vars for managed_node2 27885 1726882559.06373: Calling groups_inventory to load vars for managed_node2 27885 1726882559.06375: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882559.06391: Calling all_plugins_play to load vars for managed_node2 27885 1726882559.06601: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882559.06606: Calling groups_plugins_play to load vars for managed_node2 27885 1726882559.07974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882559.09749: done with get_vars() 27885 1726882559.09770: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:35:59 -0400 (0:00:00.055) 0:00:31.741 ****** 27885 1726882559.09866: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 27885 1726882559.10230: worker is 1 (out of 1 available) 27885 1726882559.10242: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 27885 1726882559.10255: done queuing things up, now waiting for results queue to drain 27885 1726882559.10257: waiting for pending results... 27885 1726882559.10702: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 27885 1726882559.10708: in run() - task 12673a56-9f93-3fa5-01be-000000000651 27885 1726882559.10735: variable 'ansible_search_path' from source: unknown 27885 1726882559.10744: variable 'ansible_search_path' from source: unknown 27885 1726882559.10784: calling self._execute() 27885 1726882559.10899: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882559.10912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882559.11041: variable 'omit' from source: magic vars 27885 1726882559.11341: variable 'ansible_distribution_major_version' from source: facts 27885 1726882559.11359: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882559.11378: variable 'omit' from source: magic vars 27885 1726882559.11433: variable 'omit' from source: magic vars 27885 1726882559.11597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27885 1726882559.16625: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27885 1726882559.16859: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27885 1726882559.16864: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27885 1726882559.16867: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27885 1726882559.16988: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27885 1726882559.17152: variable 'network_provider' from source: set_fact 27885 1726882559.17590: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27885 1726882559.17689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27885 1726882559.17784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27885 1726882559.17899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27885 1726882559.17980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27885 1726882559.18190: variable 'omit' from source: magic vars 27885 1726882559.18443: variable 'omit' from source: magic vars 27885 1726882559.18836: variable 'network_connections' from source: include params 27885 1726882559.18840: variable 'interface0' from source: play vars 27885 1726882559.18890: variable 'interface0' from source: play vars 27885 1726882559.18959: variable 'interface1' from source: play vars 27885 1726882559.19027: variable 'interface1' from source: play vars 27885 1726882559.19390: variable 'omit' from source: magic vars 27885 1726882559.19395: variable '__lsr_ansible_managed' from source: task vars 27885 1726882559.19461: variable '__lsr_ansible_managed' from source: task vars 27885 1726882559.19926: Loaded config def from plugin (lookup/template) 27885 1726882559.19929: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 27885 1726882559.19932: File lookup term: get_ansible_managed.j2 27885 1726882559.19934: variable 'ansible_search_path' from source: unknown 27885 1726882559.19937: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 27885 1726882559.19946: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 27885 1726882559.19948: variable 'ansible_search_path' from source: unknown 27885 1726882559.26134: variable 'ansible_managed' from source: unknown 27885 
1726882559.26138: variable 'omit' from source: magic vars 27885 1726882559.26141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882559.26144: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882559.26146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882559.26148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882559.26158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882559.26194: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882559.26199: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882559.26201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882559.26298: Set connection var ansible_pipelining to False 27885 1726882559.26301: Set connection var ansible_connection to ssh 27885 1726882559.26308: Set connection var ansible_timeout to 10 27885 1726882559.26311: Set connection var ansible_shell_type to sh 27885 1726882559.26350: Set connection var ansible_shell_executable to /bin/sh 27885 1726882559.26353: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882559.26355: variable 'ansible_shell_executable' from source: unknown 27885 1726882559.26358: variable 'ansible_connection' from source: unknown 27885 1726882559.26360: variable 'ansible_module_compression' from source: unknown 27885 1726882559.26362: variable 'ansible_shell_type' from source: unknown 27885 1726882559.26365: variable 'ansible_shell_executable' from source: unknown 27885 1726882559.26367: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882559.26369: variable 'ansible_pipelining' from source: unknown 27885 1726882559.26370: variable 'ansible_timeout' from source: unknown 27885 1726882559.26372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882559.26568: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882559.26580: variable 'omit' from source: magic vars 27885 1726882559.26582: starting attempt loop 27885 1726882559.26585: running the handler 27885 1726882559.26590: _low_level_execute_command(): starting 27885 1726882559.26595: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882559.27291: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882559.27318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882559.27331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882559.27350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882559.27522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882559.29195: stdout chunk (state=3): >>>/root <<< 27885 1726882559.29329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882559.29334: stdout chunk (state=3): >>><<< 27885 1726882559.29343: stderr chunk (state=3): >>><<< 27885 1726882559.29515: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882559.29526: _low_level_execute_command(): starting 27885 1726882559.29531: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882559.295134-29317-151308909442319 `" && echo ansible-tmp-1726882559.295134-29317-151308909442319="` echo /root/.ansible/tmp/ansible-tmp-1726882559.295134-29317-151308909442319 `" ) && sleep 0' 27885 1726882559.30376: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882559.30380: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882559.30382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882559.30385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882559.30391: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882559.30396: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882559.30591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882559.30597: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27885 1726882559.30600: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882559.30602: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27885 1726882559.30606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882559.30608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882559.30610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882559.30612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882559.30614: stderr chunk (state=3): >>>debug2: match found <<< 27885 1726882559.30616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882559.30618: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882559.30620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882559.30622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882559.30722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882559.32601: stdout chunk (state=3): >>>ansible-tmp-1726882559.295134-29317-151308909442319=/root/.ansible/tmp/ansible-tmp-1726882559.295134-29317-151308909442319 <<< 27885 1726882559.32816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882559.32820: stdout chunk (state=3): >>><<< 27885 1726882559.32829: stderr chunk (state=3): >>><<< 27885 1726882559.32885: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882559.295134-29317-151308909442319=/root/.ansible/tmp/ansible-tmp-1726882559.295134-29317-151308909442319 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882559.32938: variable 'ansible_module_compression' from source: unknown 27885 1726882559.33101: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 27885 1726882559.33132: variable 'ansible_facts' from source: unknown 27885 1726882559.33351: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882559.295134-29317-151308909442319/AnsiballZ_network_connections.py 27885 1726882559.33664: Sending initial data 27885 1726882559.33678: Sent initial data (167 bytes) 27885 1726882559.34855: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882559.34868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882559.34973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882559.35185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882559.35323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882559.35439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882559.36916: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 27885 1726882559.36925: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882559.36985: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27885 1726882559.37064: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmplafvj2mk /root/.ansible/tmp/ansible-tmp-1726882559.295134-29317-151308909442319/AnsiballZ_network_connections.py <<< 27885 1726882559.37067: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882559.295134-29317-151308909442319/AnsiballZ_network_connections.py" <<< 27885 1726882559.37133: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmplafvj2mk" to remote "/root/.ansible/tmp/ansible-tmp-1726882559.295134-29317-151308909442319/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882559.295134-29317-151308909442319/AnsiballZ_network_connections.py" <<< 27885 1726882559.38686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882559.38689: stderr chunk (state=3): >>><<< 27885 1726882559.38692: stdout chunk (state=3): >>><<< 27885 1726882559.38696: done transferring module to remote 27885 1726882559.38698: _low_level_execute_command(): starting 27885 1726882559.38701: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882559.295134-29317-151308909442319/ /root/.ansible/tmp/ansible-tmp-1726882559.295134-29317-151308909442319/AnsiballZ_network_connections.py && sleep 0' 27885 1726882559.39458: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882559.39465: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882559.39475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882559.39489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882559.39641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882559.39645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882559.39647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882559.39704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882559.41565: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882559.41569: stdout chunk (state=3): >>><<< 27885 1726882559.41575: stderr chunk (state=3): >>><<< 27885 1726882559.41674: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882559.41683: _low_level_execute_command(): starting 27885 1726882559.41688: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882559.295134-29317-151308909442319/AnsiballZ_network_connections.py && sleep 0' 27885 1726882559.42753: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882559.42808: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882559.43029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882559.43048: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882559.43127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882559.82547: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_dyosn9cz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_dyosn9cz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/4e1ae63c-1d45-4cf6-8c23-7abd72c157ff: error=unknown <<< 27885 1726882559.84262: stdout chunk (state=3): >>>Traceback (most recent call last): 
File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_dyosn9cz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 27885 1726882559.84267: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_dyosn9cz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest1/9dc4eccb-4733-427c-9f5b-05ec76f599ff: error=unknown <<< 27885 1726882559.84469: stdout chunk (state=3): >>> <<< 27885 1726882559.84619: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 27885 1726882559.86315: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882559.86348: stderr chunk (state=3): >>><<< 27885 1726882559.86351: stdout chunk (state=3): >>><<< 27885 1726882559.86369: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_dyosn9cz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_dyosn9cz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/4e1ae63c-1d45-4cf6-8c23-7abd72c157ff: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_dyosn9cz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_dyosn9cz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest1/9dc4eccb-4733-427c-9f5b-05ec76f599ff: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 27885 1726882559.86406: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'ethtest1', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882559.295134-29317-151308909442319/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882559.86417: _low_level_execute_command(): starting 27885 1726882559.86422: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882559.295134-29317-151308909442319/ > /dev/null 2>&1 && sleep 0' 27885 1726882559.86888: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882559.86892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882559.86896: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27885 1726882559.86898: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882559.86900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882559.86949: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882559.86957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882559.86959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882559.87019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882559.88847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882559.88872: stderr chunk (state=3): >>><<< 27885 1726882559.88875: stdout chunk (state=3): >>><<< 27885 1726882559.88892: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882559.88899: handler run complete 27885 1726882559.88919: attempt loop complete, returning result 27885 1726882559.88922: _execute() done 27885 1726882559.88924: dumping result to json 27885 1726882559.88930: done dumping result, returning 27885 1726882559.88939: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-3fa5-01be-000000000651] 27885 1726882559.88944: sending task result for task 12673a56-9f93-3fa5-01be-000000000651 27885 1726882559.89049: done sending task result for task 12673a56-9f93-3fa5-01be-000000000651 27885 1726882559.89051: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent", "state": "down" }, { "name": "ethtest1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 27885 1726882559.89157: no more pending results, returning what we have 27885 1726882559.89160: results queue empty 27885 1726882559.89161: checking for any_errors_fatal 27885 1726882559.89173: done checking for any_errors_fatal 27885 1726882559.89174: checking for max_fail_percentage 27885 1726882559.89176: done checking for max_fail_percentage 27885 1726882559.89176: checking to see if all hosts have failed and the running result is not ok 27885 1726882559.89177: done checking to see if all hosts have failed 27885 1726882559.89177: getting the remaining hosts for this loop 27885 1726882559.89179: done getting the remaining hosts for this loop 27885 1726882559.89183: getting the next task for host managed_node2 27885 1726882559.89192: done getting next task for host managed_node2 27885 1726882559.89197: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 27885 1726882559.89201: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882559.89212: getting variables 27885 1726882559.89214: in VariableManager get_vars() 27885 1726882559.89252: Calling all_inventory to load vars for managed_node2 27885 1726882559.89254: Calling groups_inventory to load vars for managed_node2 27885 1726882559.89256: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882559.89264: Calling all_plugins_play to load vars for managed_node2 27885 1726882559.89267: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882559.89270: Calling groups_plugins_play to load vars for managed_node2 27885 1726882559.90074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882559.90945: done with get_vars() 27885 1726882559.90961: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:35:59 -0400 (0:00:00.811) 0:00:32.552 ****** 27885 1726882559.91027: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 27885 1726882559.91253: worker is 1 (out of 1 available) 27885 1726882559.91266: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 27885 1726882559.91281: done queuing things up, now waiting for results queue to drain 27885 1726882559.91283: waiting for pending results... 
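The module_args captured in the "Configure networking connection profiles" result above imply a task invocation of fedora.linux_system_roles.network_connections roughly like the sketch below. This is a reconstruction from the logged arguments only (provider "nm", two profiles removed and brought down); the role's real task also passes internal arguments such as __header, ignore_errors and force_state_change, and its exact YAML is not part of this log.

    - name: Configure networking connection profiles
      fedora.linux_system_roles.network_connections:
        provider: nm
        connections:
          - name: ethtest0
            persistent_state: absent
            state: down
          - name: ethtest1
            persistent_state: absent
            state: down
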
27885 1726882559.91455: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 27885 1726882559.91539: in run() - task 12673a56-9f93-3fa5-01be-000000000652 27885 1726882559.91550: variable 'ansible_search_path' from source: unknown 27885 1726882559.91553: variable 'ansible_search_path' from source: unknown 27885 1726882559.91580: calling self._execute() 27885 1726882559.91657: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882559.91661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882559.91671: variable 'omit' from source: magic vars 27885 1726882559.91946: variable 'ansible_distribution_major_version' from source: facts 27885 1726882559.91956: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882559.92034: variable 'network_state' from source: role '' defaults 27885 1726882559.92043: Evaluated conditional (network_state != {}): False 27885 1726882559.92048: when evaluation is False, skipping this task 27885 1726882559.92051: _execute() done 27885 1726882559.92054: dumping result to json 27885 1726882559.92058: done dumping result, returning 27885 1726882559.92061: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-3fa5-01be-000000000652] 27885 1726882559.92072: sending task result for task 12673a56-9f93-3fa5-01be-000000000652 27885 1726882559.92148: done sending task result for task 12673a56-9f93-3fa5-01be-000000000652 27885 1726882559.92151: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27885 1726882559.92222: no more pending results, returning what we have 27885 1726882559.92225: results queue empty 27885 1726882559.92226: checking for any_errors_fatal 27885 1726882559.92233: done checking for any_errors_fatal 27885 1726882559.92234: checking for max_fail_percentage 27885 1726882559.92235: done checking for max_fail_percentage 27885 1726882559.92236: checking to see if all hosts have failed and the running result is not ok 27885 1726882559.92237: done checking to see if all hosts have failed 27885 1726882559.92237: getting the remaining hosts for this loop 27885 1726882559.92238: done getting the remaining hosts for this loop 27885 1726882559.92241: getting the next task for host managed_node2 27885 1726882559.92247: done getting next task for host managed_node2 27885 1726882559.92250: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 27885 1726882559.92254: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 27885 1726882559.92269: getting variables 27885 1726882559.92270: in VariableManager get_vars() 27885 1726882559.92306: Calling all_inventory to load vars for managed_node2 27885 1726882559.92309: Calling groups_inventory to load vars for managed_node2 27885 1726882559.92311: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882559.92318: Calling all_plugins_play to load vars for managed_node2 27885 1726882559.92321: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882559.92324: Calling groups_plugins_play to load vars for managed_node2 27885 1726882559.93175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882559.94108: done with get_vars() 27885 1726882559.94136: done getting variables 27885 1726882559.94209: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:35:59 -0400 (0:00:00.032) 0:00:32.584 ****** 27885 1726882559.94241: entering _queue_task() for managed_node2/debug 27885 1726882559.94458: worker is 1 (out of 1 available) 27885 1726882559.94473: exiting _queue_task() for managed_node2/debug 27885 1726882559.94485: done queuing things up, now waiting for results queue to drain 27885 1726882559.94486: waiting for pending results... 
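The skip recorded above for "Configure networking state" comes from the conditional the log evaluates as false: network_state != {} while network_state still holds its role default of {}. A minimal, hypothetical task showing the same guard (the role's real task body and module arguments are not shown in this log; the debug module here is illustration only):

    - name: Illustrative task guarded the same way
      ansible.builtin.debug:
        msg: "network_state is non-empty"
      when:
        - ansible_distribution_major_version != '6'
        - network_state != {}
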
27885 1726882559.94662: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 27885 1726882559.94745: in run() - task 12673a56-9f93-3fa5-01be-000000000653 27885 1726882559.94758: variable 'ansible_search_path' from source: unknown 27885 1726882559.94761: variable 'ansible_search_path' from source: unknown 27885 1726882559.94787: calling self._execute() 27885 1726882559.94873: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882559.94877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882559.94886: variable 'omit' from source: magic vars 27885 1726882559.95157: variable 'ansible_distribution_major_version' from source: facts 27885 1726882559.95167: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882559.95173: variable 'omit' from source: magic vars 27885 1726882559.95209: variable 'omit' from source: magic vars 27885 1726882559.95233: variable 'omit' from source: magic vars 27885 1726882559.95266: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882559.95295: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882559.95311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882559.95324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882559.95334: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882559.95356: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882559.95362: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882559.95366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882559.95439: Set connection var ansible_pipelining to False 27885 1726882559.95443: Set connection var ansible_connection to ssh 27885 1726882559.95448: Set connection var ansible_timeout to 10 27885 1726882559.95451: Set connection var ansible_shell_type to sh 27885 1726882559.95456: Set connection var ansible_shell_executable to /bin/sh 27885 1726882559.95461: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882559.95485: variable 'ansible_shell_executable' from source: unknown 27885 1726882559.95491: variable 'ansible_connection' from source: unknown 27885 1726882559.95496: variable 'ansible_module_compression' from source: unknown 27885 1726882559.95498: variable 'ansible_shell_type' from source: unknown 27885 1726882559.95501: variable 'ansible_shell_executable' from source: unknown 27885 1726882559.95503: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882559.95505: variable 'ansible_pipelining' from source: unknown 27885 1726882559.95507: variable 'ansible_timeout' from source: unknown 27885 1726882559.95509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882559.95597: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 
1726882559.95612: variable 'omit' from source: magic vars 27885 1726882559.95617: starting attempt loop 27885 1726882559.95620: running the handler 27885 1726882559.95712: variable '__network_connections_result' from source: set_fact 27885 1726882559.95752: handler run complete 27885 1726882559.95765: attempt loop complete, returning result 27885 1726882559.95768: _execute() done 27885 1726882559.95770: dumping result to json 27885 1726882559.95773: done dumping result, returning 27885 1726882559.95781: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-3fa5-01be-000000000653] 27885 1726882559.95786: sending task result for task 12673a56-9f93-3fa5-01be-000000000653 27885 1726882559.95877: done sending task result for task 12673a56-9f93-3fa5-01be-000000000653 27885 1726882559.95880: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 27885 1726882559.96054: no more pending results, returning what we have 27885 1726882559.96057: results queue empty 27885 1726882559.96058: checking for any_errors_fatal 27885 1726882559.96062: done checking for any_errors_fatal 27885 1726882559.96063: checking for max_fail_percentage 27885 1726882559.96064: done checking for max_fail_percentage 27885 1726882559.96065: checking to see if all hosts have failed and the running result is not ok 27885 1726882559.96065: done checking to see if all hosts have failed 27885 1726882559.96066: getting the remaining hosts for this loop 27885 1726882559.96067: done getting the remaining hosts for this loop 27885 1726882559.96070: getting the next task for host managed_node2 27885 1726882559.96075: done getting next task for host managed_node2 27885 1726882559.96078: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 27885 1726882559.96081: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882559.96090: getting variables 27885 1726882559.96092: in VariableManager get_vars() 27885 1726882559.96130: Calling all_inventory to load vars for managed_node2 27885 1726882559.96132: Calling groups_inventory to load vars for managed_node2 27885 1726882559.96134: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882559.96142: Calling all_plugins_play to load vars for managed_node2 27885 1726882559.96145: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882559.96148: Calling groups_plugins_play to load vars for managed_node2 27885 1726882559.97477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882559.99134: done with get_vars() 27885 1726882559.99157: done getting variables 27885 1726882559.99221: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:35:59 -0400 (0:00:00.050) 0:00:32.635 ****** 27885 1726882559.99264: entering _queue_task() for managed_node2/debug 27885 1726882559.99533: worker is 1 (out of 1 available) 27885 1726882559.99548: exiting _queue_task() for managed_node2/debug 27885 1726882559.99560: done queuing things up, now waiting for results queue to drain 27885 1726882559.99561: waiting for pending results... 27885 1726882559.99757: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 27885 1726882559.99843: in run() - task 12673a56-9f93-3fa5-01be-000000000654 27885 1726882559.99855: variable 'ansible_search_path' from source: unknown 27885 1726882559.99859: variable 'ansible_search_path' from source: unknown 27885 1726882559.99899: calling self._execute() 27885 1726882559.99968: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882559.99972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882559.99982: variable 'omit' from source: magic vars 27885 1726882560.00264: variable 'ansible_distribution_major_version' from source: facts 27885 1726882560.00273: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882560.00280: variable 'omit' from source: magic vars 27885 1726882560.00324: variable 'omit' from source: magic vars 27885 1726882560.00348: variable 'omit' from source: magic vars 27885 1726882560.00379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882560.00411: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882560.00428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882560.00442: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882560.00453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882560.00475: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882560.00478: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882560.00481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882560.00555: Set connection var ansible_pipelining to False 27885 1726882560.00558: Set connection var ansible_connection to ssh 27885 1726882560.00563: Set connection var ansible_timeout to 10 27885 1726882560.00566: Set connection var ansible_shell_type to sh 27885 1726882560.00571: Set connection var ansible_shell_executable to /bin/sh 27885 1726882560.00576: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882560.00597: variable 'ansible_shell_executable' from source: unknown 27885 1726882560.00600: variable 'ansible_connection' from source: unknown 27885 1726882560.00603: variable 'ansible_module_compression' from source: unknown 27885 1726882560.00605: variable 'ansible_shell_type' from source: unknown 27885 1726882560.00607: variable 'ansible_shell_executable' from source: unknown 27885 1726882560.00609: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882560.00613: variable 'ansible_pipelining' from source: unknown 27885 1726882560.00615: variable 'ansible_timeout' from source: unknown 27885 1726882560.00619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882560.00724: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882560.00732: variable 'omit' from source: magic vars 27885 1726882560.00738: starting attempt loop 27885 1726882560.00741: running the handler 27885 1726882560.00782: variable '__network_connections_result' from source: set_fact 27885 1726882560.00840: variable '__network_connections_result' from source: set_fact 27885 1726882560.00925: handler run complete 27885 1726882560.00942: attempt loop complete, returning result 27885 1726882560.00945: _execute() done 27885 1726882560.00947: dumping result to json 27885 1726882560.00952: done dumping result, returning 27885 1726882560.00960: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-3fa5-01be-000000000654] 27885 1726882560.00964: sending task result for task 12673a56-9f93-3fa5-01be-000000000654 27885 1726882560.01051: done sending task result for task 12673a56-9f93-3fa5-01be-000000000654 27885 1726882560.01054: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent", "state": "down" }, { "name": "ethtest1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 27885 1726882560.01186: no more pending results, returning what we have 27885 1726882560.01190: results queue empty 27885 1726882560.01191: checking for any_errors_fatal 27885 1726882560.01201: done checking for any_errors_fatal 27885 1726882560.01201: checking for max_fail_percentage 27885 
1726882560.01203: done checking for max_fail_percentage 27885 1726882560.01204: checking to see if all hosts have failed and the running result is not ok 27885 1726882560.01204: done checking to see if all hosts have failed 27885 1726882560.01205: getting the remaining hosts for this loop 27885 1726882560.01207: done getting the remaining hosts for this loop 27885 1726882560.01210: getting the next task for host managed_node2 27885 1726882560.01218: done getting next task for host managed_node2 27885 1726882560.01222: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 27885 1726882560.01225: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882560.01236: getting variables 27885 1726882560.01238: in VariableManager get_vars() 27885 1726882560.01272: Calling all_inventory to load vars for managed_node2 27885 1726882560.01275: Calling groups_inventory to load vars for managed_node2 27885 1726882560.01276: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882560.01284: Calling all_plugins_play to load vars for managed_node2 27885 1726882560.01286: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882560.01288: Calling groups_plugins_play to load vars for managed_node2 27885 1726882560.02825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882560.04918: done with get_vars() 27885 1726882560.04948: done getting variables 27885 1726882560.05006: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:36:00 -0400 (0:00:00.057) 0:00:32.692 ****** 27885 1726882560.05039: entering _queue_task() for managed_node2/debug 27885 1726882560.05500: worker is 1 (out of 1 available) 27885 1726882560.05511: exiting _queue_task() for managed_node2/debug 27885 1726882560.05522: done queuing things up, now waiting for results queue to drain 27885 1726882560.05523: waiting for pending results... 
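The two debug tasks logged above (task paths roles/network/tasks/main.yml:177 and :181, per the log headers) print parts of the registered __network_connections_result fact. A sketch of what such tasks look like, assuming plain ansible.builtin.debug usage and only the condition the log actually evaluates; the role's real task bodies are not included in this log:

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines
      when: ansible_distribution_major_version != '6'

    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result
      when: ansible_distribution_major_version != '6'
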
27885 1726882560.05818: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 27885 1726882560.05864: in run() - task 12673a56-9f93-3fa5-01be-000000000655 27885 1726882560.05885: variable 'ansible_search_path' from source: unknown 27885 1726882560.05909: variable 'ansible_search_path' from source: unknown 27885 1726882560.05944: calling self._execute() 27885 1726882560.06127: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882560.06130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882560.06133: variable 'omit' from source: magic vars 27885 1726882560.06502: variable 'ansible_distribution_major_version' from source: facts 27885 1726882560.06518: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882560.06647: variable 'network_state' from source: role '' defaults 27885 1726882560.06665: Evaluated conditional (network_state != {}): False 27885 1726882560.06676: when evaluation is False, skipping this task 27885 1726882560.06692: _execute() done 27885 1726882560.06797: dumping result to json 27885 1726882560.06801: done dumping result, returning 27885 1726882560.06803: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-3fa5-01be-000000000655] 27885 1726882560.06805: sending task result for task 12673a56-9f93-3fa5-01be-000000000655 27885 1726882560.06867: done sending task result for task 12673a56-9f93-3fa5-01be-000000000655 27885 1726882560.06870: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 27885 1726882560.06924: no more pending results, returning what we have 27885 1726882560.06929: results queue empty 27885 1726882560.06930: checking for any_errors_fatal 27885 1726882560.06939: done checking for any_errors_fatal 27885 1726882560.06940: checking for max_fail_percentage 27885 1726882560.06941: done checking for max_fail_percentage 27885 1726882560.06942: checking to see if all hosts have failed and the running result is not ok 27885 1726882560.06943: done checking to see if all hosts have failed 27885 1726882560.06944: getting the remaining hosts for this loop 27885 1726882560.06946: done getting the remaining hosts for this loop 27885 1726882560.06949: getting the next task for host managed_node2 27885 1726882560.06958: done getting next task for host managed_node2 27885 1726882560.06962: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 27885 1726882560.06967: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882560.06988: getting variables 27885 1726882560.06990: in VariableManager get_vars() 27885 1726882560.07136: Calling all_inventory to load vars for managed_node2 27885 1726882560.07139: Calling groups_inventory to load vars for managed_node2 27885 1726882560.07141: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882560.07152: Calling all_plugins_play to load vars for managed_node2 27885 1726882560.07155: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882560.07157: Calling groups_plugins_play to load vars for managed_node2 27885 1726882560.08681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882560.11175: done with get_vars() 27885 1726882560.11199: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:36:00 -0400 (0:00:00.062) 0:00:32.755 ****** 27885 1726882560.11294: entering _queue_task() for managed_node2/ping 27885 1726882560.11726: worker is 1 (out of 1 available) 27885 1726882560.11737: exiting _queue_task() for managed_node2/ping 27885 1726882560.11748: done queuing things up, now waiting for results queue to drain 27885 1726882560.11749: waiting for pending results... 27885 1726882560.12012: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 27885 1726882560.12300: in run() - task 12673a56-9f93-3fa5-01be-000000000656 27885 1726882560.12304: variable 'ansible_search_path' from source: unknown 27885 1726882560.12306: variable 'ansible_search_path' from source: unknown 27885 1726882560.12309: calling self._execute() 27885 1726882560.12312: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882560.12314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882560.12317: variable 'omit' from source: magic vars 27885 1726882560.12640: variable 'ansible_distribution_major_version' from source: facts 27885 1726882560.12661: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882560.12671: variable 'omit' from source: magic vars 27885 1726882560.12734: variable 'omit' from source: magic vars 27885 1726882560.12776: variable 'omit' from source: magic vars 27885 1726882560.12823: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882560.12868: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882560.12894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882560.12917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882560.12975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882560.12978: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882560.12981: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882560.12983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882560.13081: Set connection var ansible_pipelining to False 27885 1726882560.13097: Set connection var 
ansible_connection to ssh 27885 1726882560.13107: Set connection var ansible_timeout to 10 27885 1726882560.13113: Set connection var ansible_shell_type to sh 27885 1726882560.13122: Set connection var ansible_shell_executable to /bin/sh 27885 1726882560.13130: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882560.13157: variable 'ansible_shell_executable' from source: unknown 27885 1726882560.13198: variable 'ansible_connection' from source: unknown 27885 1726882560.13201: variable 'ansible_module_compression' from source: unknown 27885 1726882560.13203: variable 'ansible_shell_type' from source: unknown 27885 1726882560.13205: variable 'ansible_shell_executable' from source: unknown 27885 1726882560.13207: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882560.13209: variable 'ansible_pipelining' from source: unknown 27885 1726882560.13210: variable 'ansible_timeout' from source: unknown 27885 1726882560.13212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882560.13400: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882560.13422: variable 'omit' from source: magic vars 27885 1726882560.13498: starting attempt loop 27885 1726882560.13501: running the handler 27885 1726882560.13503: _low_level_execute_command(): starting 27885 1726882560.13505: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882560.14212: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882560.14275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882560.14305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882560.14620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882560.14692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882560.16323: stdout chunk (state=3): >>>/root <<< 27885 1726882560.16484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882560.16487: stdout chunk (state=3): >>><<< 27885 1726882560.16490: stderr chunk (state=3): >>><<< 27885 1726882560.16513: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882560.16534: _low_level_execute_command(): starting 27885 1726882560.16576: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882560.1652026-29357-86298184775306 `" && echo ansible-tmp-1726882560.1652026-29357-86298184775306="` echo /root/.ansible/tmp/ansible-tmp-1726882560.1652026-29357-86298184775306 `" ) && sleep 0' 27885 1726882560.17810: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882560.17961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882560.18198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882560.18201: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882560.18204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882560.20045: stdout chunk (state=3): >>>ansible-tmp-1726882560.1652026-29357-86298184775306=/root/.ansible/tmp/ansible-tmp-1726882560.1652026-29357-86298184775306 <<< 27885 1726882560.20188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882560.20317: stderr chunk (state=3): >>><<< 27885 1726882560.20320: stdout chunk (state=3): >>><<< 27885 1726882560.20332: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882560.1652026-29357-86298184775306=/root/.ansible/tmp/ansible-tmp-1726882560.1652026-29357-86298184775306 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882560.20425: variable 'ansible_module_compression' from source: unknown 27885 1726882560.20642: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 27885 1726882560.21075: variable 'ansible_facts' from source: unknown 27885 1726882560.21079: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882560.1652026-29357-86298184775306/AnsiballZ_ping.py 27885 1726882560.21770: Sending initial data 27885 1726882560.21774: Sent initial data (152 bytes) 27885 1726882560.22778: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882560.22961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882560.23119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882560.23143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882560.23331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882560.24781: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports 
extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882560.24858: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882560.24917: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp7y8zzxx_ /root/.ansible/tmp/ansible-tmp-1726882560.1652026-29357-86298184775306/AnsiballZ_ping.py <<< 27885 1726882560.24921: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882560.1652026-29357-86298184775306/AnsiballZ_ping.py" <<< 27885 1726882560.25021: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp7y8zzxx_" to remote "/root/.ansible/tmp/ansible-tmp-1726882560.1652026-29357-86298184775306/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882560.1652026-29357-86298184775306/AnsiballZ_ping.py" <<< 27885 1726882560.26427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882560.26430: stdout chunk (state=3): >>><<< 27885 1726882560.26440: stderr chunk (state=3): >>><<< 27885 1726882560.26678: done transferring module to remote 27885 1726882560.26680: _low_level_execute_command(): starting 27885 1726882560.26683: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882560.1652026-29357-86298184775306/ /root/.ansible/tmp/ansible-tmp-1726882560.1652026-29357-86298184775306/AnsiballZ_ping.py && sleep 0' 27885 1726882560.27784: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882560.27853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882560.27864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882560.27944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882560.27955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882560.27962: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882560.27972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882560.27986: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27885 1726882560.27996: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882560.28003: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27885 1726882560.28011: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882560.28021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882560.28037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882560.28160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: 
fd 3 setting O_NONBLOCK <<< 27885 1726882560.28299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882560.28302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882560.30073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882560.30076: stdout chunk (state=3): >>><<< 27885 1726882560.30082: stderr chunk (state=3): >>><<< 27885 1726882560.30099: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882560.30102: _low_level_execute_command(): starting 27885 1726882560.30107: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882560.1652026-29357-86298184775306/AnsiballZ_ping.py && sleep 0' 27885 1726882560.31511: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882560.31515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882560.31636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882560.31705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882560.46426: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 27885 1726882560.47557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882560.47628: 
stderr chunk (state=3): >>>Shared connection to 10.31.14.69 closed. <<< 27885 1726882560.47649: stdout chunk (state=3): >>><<< 27885 1726882560.47978: stderr chunk (state=3): >>><<< 27885 1726882560.47982: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 27885 1726882560.47984: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882560.1652026-29357-86298184775306/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882560.47990: _low_level_execute_command(): starting 27885 1726882560.47995: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882560.1652026-29357-86298184775306/ > /dev/null 2>&1 && sleep 0' 27885 1726882560.49201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882560.49318: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882560.49437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882560.49549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882560.49721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882560.51580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882560.51610: stdout chunk (state=3): >>><<< 27885 1726882560.51627: stderr chunk (state=3): >>><<< 27885 1726882560.51651: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882560.51716: handler run complete 27885 1726882560.51738: attempt loop complete, returning result 27885 1726882560.52002: _execute() done 27885 1726882560.52005: dumping result to json 27885 1726882560.52007: done dumping result, returning 27885 1726882560.52009: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-3fa5-01be-000000000656] 27885 1726882560.52011: sending task result for task 12673a56-9f93-3fa5-01be-000000000656 27885 1726882560.52078: done sending task result for task 12673a56-9f93-3fa5-01be-000000000656 27885 1726882560.52081: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 27885 1726882560.52169: no more pending results, returning what we have 27885 1726882560.52173: results queue empty 27885 1726882560.52174: checking for any_errors_fatal 27885 1726882560.52180: done checking for any_errors_fatal 27885 1726882560.52181: checking for max_fail_percentage 27885 1726882560.52183: done checking for max_fail_percentage 27885 1726882560.52184: checking to see if all hosts have failed and the running result is not ok 27885 1726882560.52184: done checking to see if all hosts have failed 27885 1726882560.52185: getting the remaining hosts for this loop 27885 1726882560.52189: done getting the remaining hosts for this loop 27885 1726882560.52196: getting the next task for host managed_node2 27885 1726882560.52206: done getting next task for host managed_node2 27885 1726882560.52209: ^ task is: TASK: meta (role_complete) 27885 1726882560.52215: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882560.52227: getting variables 27885 1726882560.52229: in VariableManager get_vars() 27885 1726882560.52275: Calling all_inventory to load vars for managed_node2 27885 1726882560.52278: Calling groups_inventory to load vars for managed_node2 27885 1726882560.52280: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882560.52519: Calling all_plugins_play to load vars for managed_node2 27885 1726882560.52525: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882560.52528: Calling groups_plugins_play to load vars for managed_node2 27885 1726882560.55726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882560.59101: done with get_vars() 27885 1726882560.59135: done getting variables 27885 1726882560.59340: done queuing things up, now waiting for results queue to drain 27885 1726882560.59343: results queue empty 27885 1726882560.59344: checking for any_errors_fatal 27885 1726882560.59347: done checking for any_errors_fatal 27885 1726882560.59348: checking for max_fail_percentage 27885 1726882560.59349: done checking for max_fail_percentage 27885 1726882560.59350: checking to see if all hosts have failed and the running result is not ok 27885 1726882560.59351: done checking to see if all hosts have failed 27885 1726882560.59351: getting the remaining hosts for this loop 27885 1726882560.59352: done getting the remaining hosts for this loop 27885 1726882560.59355: getting the next task for host managed_node2 27885 1726882560.59360: done getting next task for host managed_node2 27885 1726882560.59363: ^ task is: TASK: Delete interface1 27885 1726882560.59365: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882560.59367: getting variables 27885 1726882560.59368: in VariableManager get_vars() 27885 1726882560.59383: Calling all_inventory to load vars for managed_node2 27885 1726882560.59385: Calling groups_inventory to load vars for managed_node2 27885 1726882560.59509: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882560.59516: Calling all_plugins_play to load vars for managed_node2 27885 1726882560.59519: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882560.59522: Calling groups_plugins_play to load vars for managed_node2 27885 1726882560.62405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882560.64061: done with get_vars() 27885 1726882560.64096: done getting variables TASK [Delete interface1] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:151 Friday 20 September 2024 21:36:00 -0400 (0:00:00.528) 0:00:33.284 ****** 27885 1726882560.64165: entering _queue_task() for managed_node2/include_tasks 27885 1726882560.64837: worker is 1 (out of 1 available) 27885 1726882560.64849: exiting _queue_task() for managed_node2/include_tasks 27885 1726882560.64861: done queuing things up, now waiting for results queue to drain 27885 1726882560.64862: waiting for pending results... 27885 1726882560.65353: running TaskExecutor() for managed_node2/TASK: Delete interface1 27885 1726882560.65553: in run() - task 12673a56-9f93-3fa5-01be-0000000000b5 27885 1726882560.65577: variable 'ansible_search_path' from source: unknown 27885 1726882560.65724: calling self._execute() 27885 1726882560.65804: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882560.65812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882560.65852: variable 'omit' from source: magic vars 27885 1726882560.66233: variable 'ansible_distribution_major_version' from source: facts 27885 1726882560.66244: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882560.66253: _execute() done 27885 1726882560.66291: dumping result to json 27885 1726882560.66297: done dumping result, returning 27885 1726882560.66300: done running TaskExecutor() for managed_node2/TASK: Delete interface1 [12673a56-9f93-3fa5-01be-0000000000b5] 27885 1726882560.66302: sending task result for task 12673a56-9f93-3fa5-01be-0000000000b5 27885 1726882560.66372: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000b5 27885 1726882560.66375: WORKER PROCESS EXITING 27885 1726882560.66417: no more pending results, returning what we have 27885 1726882560.66422: in VariableManager get_vars() 27885 1726882560.66468: Calling all_inventory to load vars for managed_node2 27885 1726882560.66471: Calling groups_inventory to load vars for managed_node2 27885 1726882560.66473: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882560.66485: Calling all_plugins_play to load vars for managed_node2 27885 1726882560.66491: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882560.66495: Calling groups_plugins_play to load vars for managed_node2 27885 1726882560.67781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882560.69755: done with get_vars() 27885 1726882560.69780: variable 'ansible_search_path' from 
source: unknown 27885 1726882560.69862: we have included files to process 27885 1726882560.69864: generating all_blocks data 27885 1726882560.69866: done generating all_blocks data 27885 1726882560.69871: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 27885 1726882560.69872: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 27885 1726882560.69876: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 27885 1726882560.70142: done processing included file 27885 1726882560.70144: iterating over new_blocks loaded from include file 27885 1726882560.70145: in VariableManager get_vars() 27885 1726882560.70209: done with get_vars() 27885 1726882560.70214: filtering new block on tags 27885 1726882560.70267: done filtering new block on tags 27885 1726882560.70269: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node2 27885 1726882560.70280: extending task lists for all hosts with included blocks 27885 1726882560.71738: done extending task lists 27885 1726882560.71773: done processing included files 27885 1726882560.71775: results queue empty 27885 1726882560.71775: checking for any_errors_fatal 27885 1726882560.71777: done checking for any_errors_fatal 27885 1726882560.71778: checking for max_fail_percentage 27885 1726882560.71779: done checking for max_fail_percentage 27885 1726882560.71780: checking to see if all hosts have failed and the running result is not ok 27885 1726882560.71781: done checking to see if all hosts have failed 27885 1726882560.71782: getting the remaining hosts for this loop 27885 1726882560.71785: done getting the remaining hosts for this loop 27885 1726882560.71790: getting the next task for host managed_node2 27885 1726882560.71828: done getting next task for host managed_node2 27885 1726882560.71831: ^ task is: TASK: Remove test interface if necessary 27885 1726882560.71878: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882560.71920: getting variables 27885 1726882560.71921: in VariableManager get_vars() 27885 1726882560.71983: Calling all_inventory to load vars for managed_node2 27885 1726882560.71988: Calling groups_inventory to load vars for managed_node2 27885 1726882560.71990: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882560.71999: Calling all_plugins_play to load vars for managed_node2 27885 1726882560.72001: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882560.72004: Calling groups_plugins_play to load vars for managed_node2 27885 1726882560.79064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882560.80732: done with get_vars() 27885 1726882560.80766: done getting variables 27885 1726882560.80818: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 21:36:00 -0400 (0:00:00.166) 0:00:33.451 ****** 27885 1726882560.80852: entering _queue_task() for managed_node2/command 27885 1726882560.81299: worker is 1 (out of 1 available) 27885 1726882560.81314: exiting _queue_task() for managed_node2/command 27885 1726882560.81329: done queuing things up, now waiting for results queue to drain 27885 1726882560.81330: waiting for pending results... 27885 1726882560.81717: running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary 27885 1726882560.81763: in run() - task 12673a56-9f93-3fa5-01be-000000000777 27885 1726882560.81789: variable 'ansible_search_path' from source: unknown 27885 1726882560.81801: variable 'ansible_search_path' from source: unknown 27885 1726882560.81935: calling self._execute() 27885 1726882560.81962: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882560.81976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882560.81997: variable 'omit' from source: magic vars 27885 1726882560.82432: variable 'ansible_distribution_major_version' from source: facts 27885 1726882560.82449: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882560.82461: variable 'omit' from source: magic vars 27885 1726882560.82518: variable 'omit' from source: magic vars 27885 1726882560.82621: variable 'interface' from source: set_fact 27885 1726882560.82648: variable 'omit' from source: magic vars 27885 1726882560.82709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882560.82750: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882560.82774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882560.82809: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882560.82909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 
1726882560.82912: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882560.82916: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882560.82919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882560.82996: Set connection var ansible_pipelining to False 27885 1726882560.83009: Set connection var ansible_connection to ssh 27885 1726882560.83025: Set connection var ansible_timeout to 10 27885 1726882560.83035: Set connection var ansible_shell_type to sh 27885 1726882560.83126: Set connection var ansible_shell_executable to /bin/sh 27885 1726882560.83129: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882560.83132: variable 'ansible_shell_executable' from source: unknown 27885 1726882560.83134: variable 'ansible_connection' from source: unknown 27885 1726882560.83137: variable 'ansible_module_compression' from source: unknown 27885 1726882560.83139: variable 'ansible_shell_type' from source: unknown 27885 1726882560.83141: variable 'ansible_shell_executable' from source: unknown 27885 1726882560.83144: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882560.83147: variable 'ansible_pipelining' from source: unknown 27885 1726882560.83149: variable 'ansible_timeout' from source: unknown 27885 1726882560.83152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882560.83282: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882560.83304: variable 'omit' from source: magic vars 27885 1726882560.83314: starting attempt loop 27885 1726882560.83321: running the handler 27885 1726882560.83347: _low_level_execute_command(): starting 27885 1726882560.83360: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882560.84224: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882560.84246: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882560.84266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882560.84285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882560.84396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882560.86039: stdout chunk (state=3): >>>/root 
<<< 27885 1726882560.86220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882560.86224: stdout chunk (state=3): >>><<< 27885 1726882560.86226: stderr chunk (state=3): >>><<< 27885 1726882560.86254: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882560.86381: _low_level_execute_command(): starting 27885 1726882560.86388: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882560.8626833-29390-244347582778502 `" && echo ansible-tmp-1726882560.8626833-29390-244347582778502="` echo /root/.ansible/tmp/ansible-tmp-1726882560.8626833-29390-244347582778502 `" ) && sleep 0' 27885 1726882560.86985: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882560.87009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882560.87096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882560.88978: stdout chunk (state=3): >>>ansible-tmp-1726882560.8626833-29390-244347582778502=/root/.ansible/tmp/ansible-tmp-1726882560.8626833-29390-244347582778502 <<< 27885 1726882560.89076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882560.89299: stderr chunk (state=3): >>><<< 27885 1726882560.89303: 
stdout chunk (state=3): >>><<< 27885 1726882560.89306: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882560.8626833-29390-244347582778502=/root/.ansible/tmp/ansible-tmp-1726882560.8626833-29390-244347582778502 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882560.89309: variable 'ansible_module_compression' from source: unknown 27885 1726882560.89310: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882560.89313: variable 'ansible_facts' from source: unknown 27885 1726882560.89352: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882560.8626833-29390-244347582778502/AnsiballZ_command.py 27885 1726882560.89581: Sending initial data 27885 1726882560.89585: Sent initial data (156 bytes) 27885 1726882560.90091: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882560.90097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882560.90104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882560.90204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882560.90269: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882560.90332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882560.91870: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" 
revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882560.91969: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882560.92043: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp1lqkcg5e /root/.ansible/tmp/ansible-tmp-1726882560.8626833-29390-244347582778502/AnsiballZ_command.py <<< 27885 1726882560.92048: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882560.8626833-29390-244347582778502/AnsiballZ_command.py" <<< 27885 1726882560.92121: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp1lqkcg5e" to remote "/root/.ansible/tmp/ansible-tmp-1726882560.8626833-29390-244347582778502/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882560.8626833-29390-244347582778502/AnsiballZ_command.py" <<< 27885 1726882560.92999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882560.93017: stderr chunk (state=3): >>><<< 27885 1726882560.93024: stdout chunk (state=3): >>><<< 27885 1726882560.93057: done transferring module to remote 27885 1726882560.93060: _low_level_execute_command(): starting 27885 1726882560.93167: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882560.8626833-29390-244347582778502/ /root/.ansible/tmp/ansible-tmp-1726882560.8626833-29390-244347582778502/AnsiballZ_command.py && sleep 0' 27885 1726882560.93670: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882560.93679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882560.93688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882560.93707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882560.93719: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882560.93812: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882560.93824: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882560.93834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882560.93922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882560.95800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882560.95803: stdout chunk (state=3): >>><<< 27885 1726882560.95811: stderr chunk (state=3): >>><<< 27885 1726882560.95815: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882560.95817: _low_level_execute_command(): starting 27885 1726882560.95820: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882560.8626833-29390-244347582778502/AnsiballZ_command.py && sleep 0' 27885 1726882560.96496: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882560.96507: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882560.96537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882560.96573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882560.96662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882560.96683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882560.96782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882561.13062: stdout chunk (state=3): >>> {"changed": 
true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest1"], "start": "2024-09-20 21:36:01.115730", "end": "2024-09-20 21:36:01.127506", "delta": "0:00:00.011776", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27885 1726882561.15101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882561.15105: stdout chunk (state=3): >>><<< 27885 1726882561.15107: stderr chunk (state=3): >>><<< 27885 1726882561.15111: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest1"], "start": "2024-09-20 21:36:01.115730", "end": "2024-09-20 21:36:01.127506", "delta": "0:00:00.011776", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
27885 1726882561.15114: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882560.8626833-29390-244347582778502/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882561.15117: _low_level_execute_command(): starting 27885 1726882561.15119: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882560.8626833-29390-244347582778502/ > /dev/null 2>&1 && sleep 0' 27885 1726882561.16515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882561.16543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882561.16822: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882561.16921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882561.18773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882561.18810: stdout chunk (state=3): >>><<< 27885 1726882561.19099: stderr chunk (state=3): >>><<< 27885 1726882561.19103: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882561.19105: handler run complete 27885 1726882561.19107: Evaluated conditional (False): False 27885 1726882561.19109: attempt loop complete, returning result 27885 1726882561.19111: _execute() done 27885 1726882561.19113: dumping result to json 27885 1726882561.19115: done dumping result, returning 27885 1726882561.19117: done running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary [12673a56-9f93-3fa5-01be-000000000777] 27885 1726882561.19119: sending task result for task 12673a56-9f93-3fa5-01be-000000000777 ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest1" ], "delta": "0:00:00.011776", "end": "2024-09-20 21:36:01.127506", "rc": 0, "start": "2024-09-20 21:36:01.115730" } 27885 1726882561.19288: no more pending results, returning what we have 27885 1726882561.19292: results queue empty 27885 1726882561.19295: checking for any_errors_fatal 27885 1726882561.19296: done checking for any_errors_fatal 27885 1726882561.19297: checking for max_fail_percentage 27885 1726882561.19299: done checking for max_fail_percentage 27885 1726882561.19300: checking to see if all hosts have failed and the running result is not ok 27885 1726882561.19301: done checking to see if all hosts have failed 27885 1726882561.19301: getting the remaining hosts for this loop 27885 1726882561.19303: done getting the remaining hosts for this loop 27885 1726882561.19307: getting the next task for host managed_node2 27885 1726882561.19316: done getting next task for host managed_node2 27885 1726882561.19320: ^ task is: TASK: Assert interface1 is absent 27885 1726882561.19324: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882561.19330: getting variables 27885 1726882561.19332: in VariableManager get_vars() 27885 1726882561.19374: Calling all_inventory to load vars for managed_node2 27885 1726882561.19377: Calling groups_inventory to load vars for managed_node2 27885 1726882561.19379: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882561.19497: Calling all_plugins_play to load vars for managed_node2 27885 1726882561.19502: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882561.19506: Calling groups_plugins_play to load vars for managed_node2 27885 1726882561.20301: done sending task result for task 12673a56-9f93-3fa5-01be-000000000777 27885 1726882561.20305: WORKER PROCESS EXITING 27885 1726882561.22036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882561.24303: done with get_vars() 27885 1726882561.24332: done getting variables TASK [Assert interface1 is absent] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:153 Friday 20 September 2024 21:36:01 -0400 (0:00:00.435) 0:00:33.886 ****** 27885 1726882561.24441: entering _queue_task() for managed_node2/include_tasks 27885 1726882561.24933: worker is 1 (out of 1 available) 27885 1726882561.24945: exiting _queue_task() for managed_node2/include_tasks 27885 1726882561.24959: done queuing things up, now waiting for results queue to drain 27885 1726882561.24961: waiting for pending results... 27885 1726882561.25190: running TaskExecutor() for managed_node2/TASK: Assert interface1 is absent 27885 1726882561.25309: in run() - task 12673a56-9f93-3fa5-01be-0000000000b6 27885 1726882561.25329: variable 'ansible_search_path' from source: unknown 27885 1726882561.25369: calling self._execute() 27885 1726882561.25481: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882561.25554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882561.25557: variable 'omit' from source: magic vars 27885 1726882561.26299: variable 'ansible_distribution_major_version' from source: facts 27885 1726882561.26302: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882561.26305: _execute() done 27885 1726882561.26307: dumping result to json 27885 1726882561.26308: done dumping result, returning 27885 1726882561.26310: done running TaskExecutor() for managed_node2/TASK: Assert interface1 is absent [12673a56-9f93-3fa5-01be-0000000000b6] 27885 1726882561.26312: sending task result for task 12673a56-9f93-3fa5-01be-0000000000b6 27885 1726882561.26371: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000b6 27885 1726882561.26375: WORKER PROCESS EXITING 27885 1726882561.26404: no more pending results, returning what we have 27885 1726882561.26409: in VariableManager get_vars() 27885 1726882561.26626: Calling all_inventory to load vars for managed_node2 27885 1726882561.26630: Calling groups_inventory to load vars for managed_node2 27885 1726882561.26632: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882561.26642: Calling all_plugins_play to load vars for managed_node2 27885 1726882561.26645: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882561.26648: Calling groups_plugins_play to load vars for managed_node2 27885 1726882561.28252: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882561.30981: done with get_vars() 27885 1726882561.31099: variable 'ansible_search_path' from source: unknown 27885 1726882561.31117: we have included files to process 27885 1726882561.31118: generating all_blocks data 27885 1726882561.31124: done generating all_blocks data 27885 1726882561.31131: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 27885 1726882561.31132: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 27885 1726882561.31135: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 27885 1726882561.31502: in VariableManager get_vars() 27885 1726882561.31528: done with get_vars() 27885 1726882561.31758: done processing included file 27885 1726882561.31760: iterating over new_blocks loaded from include file 27885 1726882561.31762: in VariableManager get_vars() 27885 1726882561.31839: done with get_vars() 27885 1726882561.31842: filtering new block on tags 27885 1726882561.31876: done filtering new block on tags 27885 1726882561.31878: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 27885 1726882561.31891: extending task lists for all hosts with included blocks 27885 1726882561.35021: done extending task lists 27885 1726882561.35023: done processing included files 27885 1726882561.35024: results queue empty 27885 1726882561.35025: checking for any_errors_fatal 27885 1726882561.35030: done checking for any_errors_fatal 27885 1726882561.35031: checking for max_fail_percentage 27885 1726882561.35032: done checking for max_fail_percentage 27885 1726882561.35032: checking to see if all hosts have failed and the running result is not ok 27885 1726882561.35033: done checking to see if all hosts have failed 27885 1726882561.35034: getting the remaining hosts for this loop 27885 1726882561.35035: done getting the remaining hosts for this loop 27885 1726882561.35038: getting the next task for host managed_node2 27885 1726882561.35042: done getting next task for host managed_node2 27885 1726882561.35045: ^ task is: TASK: Include the task 'get_interface_stat.yml' 27885 1726882561.35048: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882561.35050: getting variables 27885 1726882561.35051: in VariableManager get_vars() 27885 1726882561.35118: Calling all_inventory to load vars for managed_node2 27885 1726882561.35121: Calling groups_inventory to load vars for managed_node2 27885 1726882561.35123: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882561.35129: Calling all_plugins_play to load vars for managed_node2 27885 1726882561.35131: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882561.35134: Calling groups_plugins_play to load vars for managed_node2 27885 1726882561.38177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882561.40565: done with get_vars() 27885 1726882561.40597: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:36:01 -0400 (0:00:00.162) 0:00:34.049 ****** 27885 1726882561.40684: entering _queue_task() for managed_node2/include_tasks 27885 1726882561.41184: worker is 1 (out of 1 available) 27885 1726882561.41198: exiting _queue_task() for managed_node2/include_tasks 27885 1726882561.41210: done queuing things up, now waiting for results queue to drain 27885 1726882561.41212: waiting for pending results... 27885 1726882561.41464: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 27885 1726882561.41901: in run() - task 12673a56-9f93-3fa5-01be-000000000816 27885 1726882561.41905: variable 'ansible_search_path' from source: unknown 27885 1726882561.41911: variable 'ansible_search_path' from source: unknown 27885 1726882561.41915: calling self._execute() 27885 1726882561.41917: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882561.41920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882561.41923: variable 'omit' from source: magic vars 27885 1726882561.42186: variable 'ansible_distribution_major_version' from source: facts 27885 1726882561.42203: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882561.42210: _execute() done 27885 1726882561.42213: dumping result to json 27885 1726882561.42216: done dumping result, returning 27885 1726882561.42224: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-3fa5-01be-000000000816] 27885 1726882561.42230: sending task result for task 12673a56-9f93-3fa5-01be-000000000816 27885 1726882561.42330: done sending task result for task 12673a56-9f93-3fa5-01be-000000000816 27885 1726882561.42333: WORKER PROCESS EXITING 27885 1726882561.42369: no more pending results, returning what we have 27885 1726882561.42375: in VariableManager get_vars() 27885 1726882561.42432: Calling all_inventory to load vars for managed_node2 27885 1726882561.42436: Calling groups_inventory to load vars for managed_node2 27885 1726882561.42438: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882561.42452: Calling all_plugins_play to load vars for managed_node2 27885 1726882561.42455: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882561.42458: Calling groups_plugins_play to load vars for managed_node2 27885 1726882561.45007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 27885 1726882561.47118: done with get_vars() 27885 1726882561.47147: variable 'ansible_search_path' from source: unknown 27885 1726882561.47149: variable 'ansible_search_path' from source: unknown 27885 1726882561.47190: we have included files to process 27885 1726882561.47192: generating all_blocks data 27885 1726882561.47195: done generating all_blocks data 27885 1726882561.47197: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27885 1726882561.47198: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27885 1726882561.47201: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27885 1726882561.47398: done processing included file 27885 1726882561.47401: iterating over new_blocks loaded from include file 27885 1726882561.47402: in VariableManager get_vars() 27885 1726882561.47425: done with get_vars() 27885 1726882561.47427: filtering new block on tags 27885 1726882561.47453: done filtering new block on tags 27885 1726882561.47456: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 27885 1726882561.47461: extending task lists for all hosts with included blocks 27885 1726882561.47584: done extending task lists 27885 1726882561.47585: done processing included files 27885 1726882561.47586: results queue empty 27885 1726882561.47587: checking for any_errors_fatal 27885 1726882561.47590: done checking for any_errors_fatal 27885 1726882561.47591: checking for max_fail_percentage 27885 1726882561.47592: done checking for max_fail_percentage 27885 1726882561.47595: checking to see if all hosts have failed and the running result is not ok 27885 1726882561.47595: done checking to see if all hosts have failed 27885 1726882561.47596: getting the remaining hosts for this loop 27885 1726882561.47597: done getting the remaining hosts for this loop 27885 1726882561.47600: getting the next task for host managed_node2 27885 1726882561.47605: done getting next task for host managed_node2 27885 1726882561.47607: ^ task is: TASK: Get stat for interface {{ interface }} 27885 1726882561.47611: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 27885 1726882561.47614: getting variables 27885 1726882561.47615: in VariableManager get_vars() 27885 1726882561.47630: Calling all_inventory to load vars for managed_node2 27885 1726882561.47633: Calling groups_inventory to load vars for managed_node2 27885 1726882561.47635: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882561.47641: Calling all_plugins_play to load vars for managed_node2 27885 1726882561.47643: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882561.47646: Calling groups_plugins_play to load vars for managed_node2 27885 1726882561.49078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882561.50729: done with get_vars() 27885 1726882561.50755: done getting variables 27885 1726882561.50926: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest1] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:36:01 -0400 (0:00:00.102) 0:00:34.152 ****** 27885 1726882561.50958: entering _queue_task() for managed_node2/stat 27885 1726882561.51314: worker is 1 (out of 1 available) 27885 1726882561.51327: exiting _queue_task() for managed_node2/stat 27885 1726882561.51340: done queuing things up, now waiting for results queue to drain 27885 1726882561.51342: waiting for pending results... 27885 1726882561.51626: running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest1 27885 1726882561.51760: in run() - task 12673a56-9f93-3fa5-01be-0000000008bc 27885 1726882561.51806: variable 'ansible_search_path' from source: unknown 27885 1726882561.51819: variable 'ansible_search_path' from source: unknown 27885 1726882561.51871: calling self._execute() 27885 1726882561.51979: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882561.51992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882561.52010: variable 'omit' from source: magic vars 27885 1726882561.52388: variable 'ansible_distribution_major_version' from source: facts 27885 1726882561.52408: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882561.52423: variable 'omit' from source: magic vars 27885 1726882561.52575: variable 'omit' from source: magic vars 27885 1726882561.52579: variable 'interface' from source: set_fact 27885 1726882561.52603: variable 'omit' from source: magic vars 27885 1726882561.52648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882561.52691: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882561.52722: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882561.52745: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882561.52799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882561.52804: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882561.52810: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882561.52821: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882561.52930: Set connection var ansible_pipelining to False 27885 1726882561.52940: Set connection var ansible_connection to ssh 27885 1726882561.52949: Set connection var ansible_timeout to 10 27885 1726882561.52999: Set connection var ansible_shell_type to sh 27885 1726882561.53002: Set connection var ansible_shell_executable to /bin/sh 27885 1726882561.53004: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882561.53007: variable 'ansible_shell_executable' from source: unknown 27885 1726882561.53009: variable 'ansible_connection' from source: unknown 27885 1726882561.53012: variable 'ansible_module_compression' from source: unknown 27885 1726882561.53013: variable 'ansible_shell_type' from source: unknown 27885 1726882561.53024: variable 'ansible_shell_executable' from source: unknown 27885 1726882561.53033: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882561.53039: variable 'ansible_pipelining' from source: unknown 27885 1726882561.53045: variable 'ansible_timeout' from source: unknown 27885 1726882561.53050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882561.53248: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882561.53299: variable 'omit' from source: magic vars 27885 1726882561.53303: starting attempt loop 27885 1726882561.53305: running the handler 27885 1726882561.53307: _low_level_execute_command(): starting 27885 1726882561.53309: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882561.54584: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882561.54610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882561.54814: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882561.54829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882561.54846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882561.54947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882561.56681: stdout chunk (state=3): >>>/root <<< 27885 1726882561.56833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882561.56957: stderr chunk (state=3): >>><<< 27885 1726882561.56960: stdout chunk 
(state=3): >>><<< 27885 1726882561.56981: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882561.57025: _low_level_execute_command(): starting 27885 1726882561.57103: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882561.5701041-29419-8949621245083 `" && echo ansible-tmp-1726882561.5701041-29419-8949621245083="` echo /root/.ansible/tmp/ansible-tmp-1726882561.5701041-29419-8949621245083 `" ) && sleep 0' 27885 1726882561.57861: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882561.57911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882561.57989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882561.58011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882561.58034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882561.58121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882561.60078: stdout chunk (state=3): >>>ansible-tmp-1726882561.5701041-29419-8949621245083=/root/.ansible/tmp/ansible-tmp-1726882561.5701041-29419-8949621245083 <<< 27885 1726882561.60275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882561.60340: stderr chunk (state=3): >>><<< 27885 1726882561.60343: stdout chunk (state=3): >>><<< 27885 
1726882561.60362: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882561.5701041-29419-8949621245083=/root/.ansible/tmp/ansible-tmp-1726882561.5701041-29419-8949621245083 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882561.60427: variable 'ansible_module_compression' from source: unknown 27885 1726882561.60591: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 27885 1726882561.60596: variable 'ansible_facts' from source: unknown 27885 1726882561.60651: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882561.5701041-29419-8949621245083/AnsiballZ_stat.py 27885 1726882561.60822: Sending initial data 27885 1726882561.60832: Sent initial data (151 bytes) 27885 1726882561.61419: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882561.61433: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882561.61448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882561.61515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882561.61572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882561.61595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882561.61618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882561.61708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882561.63384: stderr chunk (state=3): >>>debug2: Remote version: 3 
debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882561.63446: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882561.63525: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpjpzclzg_ /root/.ansible/tmp/ansible-tmp-1726882561.5701041-29419-8949621245083/AnsiballZ_stat.py <<< 27885 1726882561.63529: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882561.5701041-29419-8949621245083/AnsiballZ_stat.py" <<< 27885 1726882561.63601: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpjpzclzg_" to remote "/root/.ansible/tmp/ansible-tmp-1726882561.5701041-29419-8949621245083/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882561.5701041-29419-8949621245083/AnsiballZ_stat.py" <<< 27885 1726882561.64526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882561.64530: stdout chunk (state=3): >>><<< 27885 1726882561.64532: stderr chunk (state=3): >>><<< 27885 1726882561.64541: done transferring module to remote 27885 1726882561.64565: _low_level_execute_command(): starting 27885 1726882561.64576: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882561.5701041-29419-8949621245083/ /root/.ansible/tmp/ansible-tmp-1726882561.5701041-29419-8949621245083/AnsiballZ_stat.py && sleep 0' 27885 1726882561.65617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882561.65681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882561.65838: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882561.65861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882561.65957: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882561.67796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882561.67805: stdout chunk (state=3): >>><<< 27885 1726882561.67807: stderr chunk (state=3): >>><<< 27885 1726882561.67826: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882561.67839: _low_level_execute_command(): starting 27885 1726882561.67851: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882561.5701041-29419-8949621245083/AnsiballZ_stat.py && sleep 0' 27885 1726882561.68460: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882561.68475: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882561.68490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882561.68514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882561.68538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882561.68553: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882561.68658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882561.68681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882561.68791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882561.83984: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": 
false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 27885 1726882561.85383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882561.85405: stderr chunk (state=3): >>><<< 27885 1726882561.85409: stdout chunk (state=3): >>><<< 27885 1726882561.85426: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
27885 1726882561.85448: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882561.5701041-29419-8949621245083/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882561.85456: _low_level_execute_command(): starting 27885 1726882561.85461: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882561.5701041-29419-8949621245083/ > /dev/null 2>&1 && sleep 0' 27885 1726882561.85871: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882561.85878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882561.85899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882561.85916: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882561.85921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882561.85969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882561.85974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882561.85976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882561.86038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882561.87906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882561.87926: stderr chunk (state=3): >>><<< 27885 1726882561.87929: stdout chunk (state=3): >>><<< 27885 1726882561.87941: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882561.87946: handler run complete 27885 1726882561.87962: attempt loop complete, returning result 27885 1726882561.87965: _execute() done 27885 1726882561.87967: dumping result to json 27885 1726882561.87969: done dumping result, returning 27885 1726882561.87978: done running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest1 [12673a56-9f93-3fa5-01be-0000000008bc] 27885 1726882561.87983: sending task result for task 12673a56-9f93-3fa5-01be-0000000008bc 27885 1726882561.88075: done sending task result for task 12673a56-9f93-3fa5-01be-0000000008bc 27885 1726882561.88078: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 27885 1726882561.88168: no more pending results, returning what we have 27885 1726882561.88172: results queue empty 27885 1726882561.88173: checking for any_errors_fatal 27885 1726882561.88175: done checking for any_errors_fatal 27885 1726882561.88176: checking for max_fail_percentage 27885 1726882561.88177: done checking for max_fail_percentage 27885 1726882561.88178: checking to see if all hosts have failed and the running result is not ok 27885 1726882561.88179: done checking to see if all hosts have failed 27885 1726882561.88180: getting the remaining hosts for this loop 27885 1726882561.88181: done getting the remaining hosts for this loop 27885 1726882561.88185: getting the next task for host managed_node2 27885 1726882561.88194: done getting next task for host managed_node2 27885 1726882561.88197: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 27885 1726882561.88201: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882561.88207: getting variables 27885 1726882561.88208: in VariableManager get_vars() 27885 1726882561.88248: Calling all_inventory to load vars for managed_node2 27885 1726882561.88251: Calling groups_inventory to load vars for managed_node2 27885 1726882561.88253: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882561.88262: Calling all_plugins_play to load vars for managed_node2 27885 1726882561.88265: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882561.88267: Calling groups_plugins_play to load vars for managed_node2 27885 1726882561.89703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882561.91491: done with get_vars() 27885 1726882561.91519: done getting variables 27885 1726882561.91576: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882561.91704: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest1'] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:36:01 -0400 (0:00:00.407) 0:00:34.559 ****** 27885 1726882561.91740: entering _queue_task() for managed_node2/assert 27885 1726882561.92136: worker is 1 (out of 1 available) 27885 1726882561.92297: exiting _queue_task() for managed_node2/assert 27885 1726882561.92309: done queuing things up, now waiting for results queue to drain 27885 1726882561.92310: waiting for pending results... 
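The task banner above points at assert_device_absent.yml:5. A minimal sketch of what that assert task might look like; only the conditional itself (not interface_stat.stat.exists) is confirmed by the evaluation logged just below, and the failure message is a hypothetical placeholder:

- name: Assert that the interface is absent - '{{ interface }}'
  assert:
    that:
      - not interface_stat.stat.exists        # matches the conditional evaluated in the log below
    msg: "Interface {{ interface }} is still present"   # hypothetical message; the real task may differ
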
27885 1726882561.92441: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'ethtest1' 27885 1726882561.92648: in run() - task 12673a56-9f93-3fa5-01be-000000000817 27885 1726882561.92652: variable 'ansible_search_path' from source: unknown 27885 1726882561.92655: variable 'ansible_search_path' from source: unknown 27885 1726882561.92657: calling self._execute() 27885 1726882561.92731: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882561.92743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882561.92765: variable 'omit' from source: magic vars 27885 1726882561.93144: variable 'ansible_distribution_major_version' from source: facts 27885 1726882561.93161: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882561.93171: variable 'omit' from source: magic vars 27885 1726882561.93229: variable 'omit' from source: magic vars 27885 1726882561.93335: variable 'interface' from source: set_fact 27885 1726882561.93360: variable 'omit' from source: magic vars 27885 1726882561.93415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882561.93458: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882561.93484: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882561.93627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882561.93630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882561.93632: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882561.93634: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882561.93636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882561.93705: Set connection var ansible_pipelining to False 27885 1726882561.93715: Set connection var ansible_connection to ssh 27885 1726882561.93728: Set connection var ansible_timeout to 10 27885 1726882561.93740: Set connection var ansible_shell_type to sh 27885 1726882561.93750: Set connection var ansible_shell_executable to /bin/sh 27885 1726882561.93759: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882561.93788: variable 'ansible_shell_executable' from source: unknown 27885 1726882561.93802: variable 'ansible_connection' from source: unknown 27885 1726882561.93846: variable 'ansible_module_compression' from source: unknown 27885 1726882561.93848: variable 'ansible_shell_type' from source: unknown 27885 1726882561.93851: variable 'ansible_shell_executable' from source: unknown 27885 1726882561.93852: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882561.93854: variable 'ansible_pipelining' from source: unknown 27885 1726882561.93856: variable 'ansible_timeout' from source: unknown 27885 1726882561.93858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882561.94007: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 27885 1726882561.94028: variable 'omit' from source: magic vars 27885 1726882561.94065: starting attempt loop 27885 1726882561.94069: running the handler 27885 1726882561.94214: variable 'interface_stat' from source: set_fact 27885 1726882561.94231: Evaluated conditional (not interface_stat.stat.exists): True 27885 1726882561.94284: handler run complete 27885 1726882561.94291: attempt loop complete, returning result 27885 1726882561.94295: _execute() done 27885 1726882561.94297: dumping result to json 27885 1726882561.94299: done dumping result, returning 27885 1726882561.94301: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'ethtest1' [12673a56-9f93-3fa5-01be-000000000817] 27885 1726882561.94303: sending task result for task 12673a56-9f93-3fa5-01be-000000000817 27885 1726882561.94523: done sending task result for task 12673a56-9f93-3fa5-01be-000000000817 27885 1726882561.94527: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 27885 1726882561.94580: no more pending results, returning what we have 27885 1726882561.94585: results queue empty 27885 1726882561.94589: checking for any_errors_fatal 27885 1726882561.94614: done checking for any_errors_fatal 27885 1726882561.94615: checking for max_fail_percentage 27885 1726882561.94617: done checking for max_fail_percentage 27885 1726882561.94618: checking to see if all hosts have failed and the running result is not ok 27885 1726882561.94619: done checking to see if all hosts have failed 27885 1726882561.94620: getting the remaining hosts for this loop 27885 1726882561.94622: done getting the remaining hosts for this loop 27885 1726882561.94626: getting the next task for host managed_node2 27885 1726882561.94636: done getting next task for host managed_node2 27885 1726882561.94639: ^ task is: TASK: Set interface0 27885 1726882561.94644: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882561.94650: getting variables 27885 1726882561.94652: in VariableManager get_vars() 27885 1726882561.94905: Calling all_inventory to load vars for managed_node2 27885 1726882561.94908: Calling groups_inventory to load vars for managed_node2 27885 1726882561.94910: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882561.94920: Calling all_plugins_play to load vars for managed_node2 27885 1726882561.94922: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882561.94925: Calling groups_plugins_play to load vars for managed_node2 27885 1726882561.96491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882561.98629: done with get_vars() 27885 1726882561.98654: done getting variables 27885 1726882561.98739: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set interface0] ********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:155 Friday 20 September 2024 21:36:01 -0400 (0:00:00.070) 0:00:34.630 ****** 27885 1726882561.98768: entering _queue_task() for managed_node2/set_fact 27885 1726882561.99223: worker is 1 (out of 1 available) 27885 1726882561.99236: exiting _queue_task() for managed_node2/set_fact 27885 1726882561.99249: done queuing things up, now waiting for results queue to drain 27885 1726882561.99251: waiting for pending results... 27885 1726882561.99478: running TaskExecutor() for managed_node2/TASK: Set interface0 27885 1726882561.99623: in run() - task 12673a56-9f93-3fa5-01be-0000000000b7 27885 1726882561.99628: variable 'ansible_search_path' from source: unknown 27885 1726882561.99631: calling self._execute() 27885 1726882561.99699: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882561.99706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882561.99715: variable 'omit' from source: magic vars 27885 1726882562.00069: variable 'ansible_distribution_major_version' from source: facts 27885 1726882562.00082: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882562.00165: variable 'omit' from source: magic vars 27885 1726882562.00169: variable 'omit' from source: magic vars 27885 1726882562.00172: variable 'interface0' from source: play vars 27885 1726882562.00224: variable 'interface0' from source: play vars 27885 1726882562.00243: variable 'omit' from source: magic vars 27885 1726882562.00284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882562.00323: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882562.00345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882562.00364: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882562.00383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882562.00412: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882562.00416: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882562.00419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882562.00603: Set connection var ansible_pipelining to False 27885 1726882562.00607: Set connection var ansible_connection to ssh 27885 1726882562.00609: Set connection var ansible_timeout to 10 27885 1726882562.00611: Set connection var ansible_shell_type to sh 27885 1726882562.00613: Set connection var ansible_shell_executable to /bin/sh 27885 1726882562.00615: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882562.00617: variable 'ansible_shell_executable' from source: unknown 27885 1726882562.00620: variable 'ansible_connection' from source: unknown 27885 1726882562.00622: variable 'ansible_module_compression' from source: unknown 27885 1726882562.00624: variable 'ansible_shell_type' from source: unknown 27885 1726882562.00626: variable 'ansible_shell_executable' from source: unknown 27885 1726882562.00628: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882562.00630: variable 'ansible_pipelining' from source: unknown 27885 1726882562.00632: variable 'ansible_timeout' from source: unknown 27885 1726882562.00634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882562.00713: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882562.00725: variable 'omit' from source: magic vars 27885 1726882562.00731: starting attempt loop 27885 1726882562.00734: running the handler 27885 1726882562.00744: handler run complete 27885 1726882562.00754: attempt loop complete, returning result 27885 1726882562.00757: _execute() done 27885 1726882562.00759: dumping result to json 27885 1726882562.00762: done dumping result, returning 27885 1726882562.00820: done running TaskExecutor() for managed_node2/TASK: Set interface0 [12673a56-9f93-3fa5-01be-0000000000b7] 27885 1726882562.00823: sending task result for task 12673a56-9f93-3fa5-01be-0000000000b7 27885 1726882562.00876: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000b7 27885 1726882562.00879: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "interface": "ethtest0" }, "changed": false } 27885 1726882562.00970: no more pending results, returning what we have 27885 1726882562.00972: results queue empty 27885 1726882562.00974: checking for any_errors_fatal 27885 1726882562.00980: done checking for any_errors_fatal 27885 1726882562.00980: checking for max_fail_percentage 27885 1726882562.00982: done checking for max_fail_percentage 27885 1726882562.00982: checking to see if all hosts have failed and the running result is not ok 27885 1726882562.00983: done checking to see if all hosts have failed 27885 1726882562.00984: getting the remaining hosts for this loop 27885 1726882562.00985: done getting the remaining hosts for this loop 27885 1726882562.00991: getting the next task for host managed_node2 27885 1726882562.01002: done getting next task for host managed_node2 27885 1726882562.01005: ^ task is: TASK: Delete interface0 27885 1726882562.01009: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, 
handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882562.01013: getting variables 27885 1726882562.01014: in VariableManager get_vars() 27885 1726882562.01045: Calling all_inventory to load vars for managed_node2 27885 1726882562.01047: Calling groups_inventory to load vars for managed_node2 27885 1726882562.01049: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882562.01057: Calling all_plugins_play to load vars for managed_node2 27885 1726882562.01060: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882562.01062: Calling groups_plugins_play to load vars for managed_node2 27885 1726882562.02336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882562.04038: done with get_vars() 27885 1726882562.04058: done getting variables TASK [Delete interface0] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:158 Friday 20 September 2024 21:36:02 -0400 (0:00:00.053) 0:00:34.683 ****** 27885 1726882562.04146: entering _queue_task() for managed_node2/include_tasks 27885 1726882562.04461: worker is 1 (out of 1 available) 27885 1726882562.04472: exiting _queue_task() for managed_node2/include_tasks 27885 1726882562.04485: done queuing things up, now waiting for results queue to drain 27885 1726882562.04486: waiting for pending results... 
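For orientation: neither tests_route_device.yml nor the files it includes are reproduced in this trace, only their task names, paths and results. A minimal sketch of what tests_route_device.yml:155-158 plausibly contains, inferred from the "Set interface0" result recorded above ("interface": "ethtest0", taken from the play var interface0) and from the include of tasks/delete_interface.yml that the "Delete interface0" task performs below, would be:

    - name: Set interface0
      ansible.builtin.set_fact:
        interface: "{{ interface0 }}"   # the log shows interface0 resolving to ethtest0

    - name: Delete interface0
      ansible.builtin.include_tasks: tasks/delete_interface.yml

Only the task names, the fact value and the included file path come from the log; the exact YAML, including the fully qualified module names, is an assumption.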
27885 1726882562.04812: running TaskExecutor() for managed_node2/TASK: Delete interface0 27885 1726882562.04853: in run() - task 12673a56-9f93-3fa5-01be-0000000000b8 27885 1726882562.04901: variable 'ansible_search_path' from source: unknown 27885 1726882562.04905: calling self._execute() 27885 1726882562.04998: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882562.05006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882562.05013: variable 'omit' from source: magic vars 27885 1726882562.05598: variable 'ansible_distribution_major_version' from source: facts 27885 1726882562.05602: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882562.05604: _execute() done 27885 1726882562.05606: dumping result to json 27885 1726882562.05608: done dumping result, returning 27885 1726882562.05610: done running TaskExecutor() for managed_node2/TASK: Delete interface0 [12673a56-9f93-3fa5-01be-0000000000b8] 27885 1726882562.05612: sending task result for task 12673a56-9f93-3fa5-01be-0000000000b8 27885 1726882562.05668: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000b8 27885 1726882562.05671: WORKER PROCESS EXITING 27885 1726882562.05711: no more pending results, returning what we have 27885 1726882562.05716: in VariableManager get_vars() 27885 1726882562.05755: Calling all_inventory to load vars for managed_node2 27885 1726882562.05758: Calling groups_inventory to load vars for managed_node2 27885 1726882562.05760: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882562.05769: Calling all_plugins_play to load vars for managed_node2 27885 1726882562.05772: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882562.05775: Calling groups_plugins_play to load vars for managed_node2 27885 1726882562.07092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882562.08691: done with get_vars() 27885 1726882562.08718: variable 'ansible_search_path' from source: unknown 27885 1726882562.08734: we have included files to process 27885 1726882562.08735: generating all_blocks data 27885 1726882562.08737: done generating all_blocks data 27885 1726882562.08742: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 27885 1726882562.08743: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 27885 1726882562.08745: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 27885 1726882562.08922: done processing included file 27885 1726882562.08924: iterating over new_blocks loaded from include file 27885 1726882562.08925: in VariableManager get_vars() 27885 1726882562.08944: done with get_vars() 27885 1726882562.08946: filtering new block on tags 27885 1726882562.08970: done filtering new block on tags 27885 1726882562.08973: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node2 27885 1726882562.08978: extending task lists for all hosts with included blocks 27885 1726882562.10328: done extending task lists 27885 1726882562.10329: done processing included files 27885 1726882562.10330: results queue 
empty 27885 1726882562.10331: checking for any_errors_fatal 27885 1726882562.10334: done checking for any_errors_fatal 27885 1726882562.10335: checking for max_fail_percentage 27885 1726882562.10336: done checking for max_fail_percentage 27885 1726882562.10336: checking to see if all hosts have failed and the running result is not ok 27885 1726882562.10337: done checking to see if all hosts have failed 27885 1726882562.10338: getting the remaining hosts for this loop 27885 1726882562.10339: done getting the remaining hosts for this loop 27885 1726882562.10341: getting the next task for host managed_node2 27885 1726882562.10345: done getting next task for host managed_node2 27885 1726882562.10347: ^ task is: TASK: Remove test interface if necessary 27885 1726882562.10351: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882562.10354: getting variables 27885 1726882562.10355: in VariableManager get_vars() 27885 1726882562.10370: Calling all_inventory to load vars for managed_node2 27885 1726882562.10372: Calling groups_inventory to load vars for managed_node2 27885 1726882562.10374: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882562.10380: Calling all_plugins_play to load vars for managed_node2 27885 1726882562.10382: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882562.10385: Calling groups_plugins_play to load vars for managed_node2 27885 1726882562.11641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882562.13297: done with get_vars() 27885 1726882562.13320: done getting variables 27885 1726882562.13378: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 21:36:02 -0400 (0:00:00.092) 0:00:34.776 ****** 27885 1726882562.13416: entering _queue_task() for managed_node2/command 27885 1726882562.13818: worker is 1 (out of 1 available) 27885 1726882562.13832: exiting _queue_task() for managed_node2/command 27885 1726882562.13845: done queuing things up, now waiting for results queue to drain 27885 1726882562.13847: waiting for pending results... 
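The included delete_interface.yml is likewise not shown here, only its effect. Judging from the module arguments recorded further down (_raw_params: "ip link del ethtest0") and from the final result being reported as ok with "changed": false even though the command module itself returns "changed": true, a plausible sketch of the task at delete_interface.yml:3 is:

    - name: Remove test interface if necessary
      ansible.builtin.command: ip link del {{ interface }}
      ignore_errors: true    # assumption: keeps teardown green when the interface is already gone
      changed_when: false    # assumption: would account for the ok / "changed": false reported below

Only the task name and the command line are confirmed by the trace; the error-handling and change-reporting keywords are assumptions consistent with the recorded behaviour.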
27885 1726882562.14101: running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary 27885 1726882562.14192: in run() - task 12673a56-9f93-3fa5-01be-0000000008da 27885 1726882562.14219: variable 'ansible_search_path' from source: unknown 27885 1726882562.14223: variable 'ansible_search_path' from source: unknown 27885 1726882562.14261: calling self._execute() 27885 1726882562.14348: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882562.14352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882562.14356: variable 'omit' from source: magic vars 27885 1726882562.15056: variable 'ansible_distribution_major_version' from source: facts 27885 1726882562.15059: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882562.15062: variable 'omit' from source: magic vars 27885 1726882562.15319: variable 'omit' from source: magic vars 27885 1726882562.15349: variable 'interface' from source: set_fact 27885 1726882562.15376: variable 'omit' from source: magic vars 27885 1726882562.15429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882562.15479: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882562.15510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882562.15538: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882562.15562: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882562.15602: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882562.15612: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882562.15621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882562.15734: Set connection var ansible_pipelining to False 27885 1726882562.15753: Set connection var ansible_connection to ssh 27885 1726882562.15759: Set connection var ansible_timeout to 10 27885 1726882562.15861: Set connection var ansible_shell_type to sh 27885 1726882562.15864: Set connection var ansible_shell_executable to /bin/sh 27885 1726882562.15866: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882562.15868: variable 'ansible_shell_executable' from source: unknown 27885 1726882562.15870: variable 'ansible_connection' from source: unknown 27885 1726882562.15874: variable 'ansible_module_compression' from source: unknown 27885 1726882562.15876: variable 'ansible_shell_type' from source: unknown 27885 1726882562.15878: variable 'ansible_shell_executable' from source: unknown 27885 1726882562.15880: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882562.15882: variable 'ansible_pipelining' from source: unknown 27885 1726882562.15884: variable 'ansible_timeout' from source: unknown 27885 1726882562.15888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882562.16072: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 27885 1726882562.16076: variable 'omit' from source: magic vars 27885 1726882562.16079: starting attempt loop 27885 1726882562.16081: running the handler 27885 1726882562.16083: _low_level_execute_command(): starting 27885 1726882562.16085: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882562.16814: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882562.16830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882562.16885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882562.16891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882562.16961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882562.19348: stdout chunk (state=3): >>>/root <<< 27885 1726882562.19352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882562.19354: stdout chunk (state=3): >>><<< 27885 1726882562.19357: stderr chunk (state=3): >>><<< 27885 1726882562.19361: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882562.19364: _low_level_execute_command(): starting 27885 1726882562.19367: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882562.1924033-29444-258599498194195 `" && echo 
ansible-tmp-1726882562.1924033-29444-258599498194195="` echo /root/.ansible/tmp/ansible-tmp-1726882562.1924033-29444-258599498194195 `" ) && sleep 0' 27885 1726882562.20124: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882562.20132: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882562.20142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882562.20156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882562.20174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882562.20178: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882562.20192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882562.20283: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27885 1726882562.20290: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882562.20292: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27885 1726882562.20296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882562.20299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882562.20301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882562.20308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882562.20311: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882562.20315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882562.20333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882562.20345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882562.20433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882562.22613: stdout chunk (state=3): >>>ansible-tmp-1726882562.1924033-29444-258599498194195=/root/.ansible/tmp/ansible-tmp-1726882562.1924033-29444-258599498194195 <<< 27885 1726882562.22616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882562.22619: stdout chunk (state=3): >>><<< 27885 1726882562.22621: stderr chunk (state=3): >>><<< 27885 1726882562.22623: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882562.1924033-29444-258599498194195=/root/.ansible/tmp/ansible-tmp-1726882562.1924033-29444-258599498194195 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882562.22654: variable 'ansible_module_compression' from source: unknown 27885 1726882562.22701: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882562.22735: variable 'ansible_facts' from source: unknown 27885 1726882562.23049: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882562.1924033-29444-258599498194195/AnsiballZ_command.py 27885 1726882562.23308: Sending initial data 27885 1726882562.23311: Sent initial data (156 bytes) 27885 1726882562.24252: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882562.24256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882562.24283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882562.24292: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882562.24333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882562.24392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882562.24403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882562.24511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882562.26248: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882562.26316: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27885 1726882562.26376: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpqb__zhv8 /root/.ansible/tmp/ansible-tmp-1726882562.1924033-29444-258599498194195/AnsiballZ_command.py <<< 27885 1726882562.26456: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882562.1924033-29444-258599498194195/AnsiballZ_command.py" <<< 27885 1726882562.26474: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpqb__zhv8" to remote "/root/.ansible/tmp/ansible-tmp-1726882562.1924033-29444-258599498194195/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882562.1924033-29444-258599498194195/AnsiballZ_command.py" <<< 27885 1726882562.27701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882562.27753: stderr chunk (state=3): >>><<< 27885 1726882562.27984: stdout chunk (state=3): >>><<< 27885 1726882562.27991: done transferring module to remote 27885 1726882562.28000: _low_level_execute_command(): starting 27885 1726882562.28003: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882562.1924033-29444-258599498194195/ /root/.ansible/tmp/ansible-tmp-1726882562.1924033-29444-258599498194195/AnsiballZ_command.py && sleep 0' 27885 1726882562.28582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882562.28603: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882562.28678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882562.28731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882562.28750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882562.28769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882562.29076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882562.31098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882562.31102: stdout chunk (state=3): >>><<< 27885 1726882562.31105: stderr chunk (state=3): >>><<< 27885 1726882562.31107: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882562.31110: _low_level_execute_command(): starting 27885 1726882562.31113: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882562.1924033-29444-258599498194195/AnsiballZ_command.py && sleep 0' 27885 1726882562.32105: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882562.32108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882562.32110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27885 1726882562.32113: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882562.32232: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882562.32440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882562.48351: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 21:36:02.473056", "end": "2024-09-20 21:36:02.481460", "delta": "0:00:00.008404", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27885 1726882562.49929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882562.49933: stdout chunk (state=3): >>><<< 27885 1726882562.49935: stderr chunk (state=3): >>><<< 27885 1726882562.49938: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 21:36:02.473056", "end": "2024-09-20 21:36:02.481460", "delta": "0:00:00.008404", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
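The module_args block above is the wire-level form of the command invocation. Purely as an illustration of how those internal keys map onto the documented options of ansible.builtin.command, the same call written out explicitly (with ethtest0 already substituted for {{ interface }}) would look roughly like:

    - name: Remove test interface if necessary   # explicit equivalent of the recorded invocation
      ansible.builtin.command:
        cmd: ip link del ethtest0                # _raw_params in the recorded module_args
      # _uses_shell: false -> command module, no shell involved
      # expand_argument_vars, strip_empty_ends, stdin_add_newline: true are the module defaults
      # argv, chdir, creates, removes, executable, stdin: null -> options left unset

Note that the module itself reports "changed": true for a successful ip link del; the ok / "changed": false line that follows in the trace comes from task-level change reporting (see the sketch above), not from the module.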
27885 1726882562.49953: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882562.1924033-29444-258599498194195/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882562.50031: _low_level_execute_command(): starting 27885 1726882562.50035: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882562.1924033-29444-258599498194195/ > /dev/null 2>&1 && sleep 0' 27885 1726882562.50557: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882562.50567: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882562.50578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882562.50592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882562.50623: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882562.50701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27885 1726882562.50711: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882562.50720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882562.50722: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882562.50743: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882562.50845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882562.52677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882562.52696: stderr chunk (state=3): >>><<< 27885 1726882562.52699: stdout chunk (state=3): >>><<< 27885 1726882562.52714: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882562.52719: handler run complete 27885 1726882562.52735: Evaluated conditional (False): False 27885 1726882562.52744: attempt loop complete, returning result 27885 1726882562.52747: _execute() done 27885 1726882562.52749: dumping result to json 27885 1726882562.52754: done dumping result, returning 27885 1726882562.52761: done running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary [12673a56-9f93-3fa5-01be-0000000008da] 27885 1726882562.52767: sending task result for task 12673a56-9f93-3fa5-01be-0000000008da 27885 1726882562.52863: done sending task result for task 12673a56-9f93-3fa5-01be-0000000008da 27885 1726882562.52866: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.008404", "end": "2024-09-20 21:36:02.481460", "rc": 0, "start": "2024-09-20 21:36:02.473056" } 27885 1726882562.52956: no more pending results, returning what we have 27885 1726882562.52959: results queue empty 27885 1726882562.52960: checking for any_errors_fatal 27885 1726882562.52962: done checking for any_errors_fatal 27885 1726882562.52962: checking for max_fail_percentage 27885 1726882562.52964: done checking for max_fail_percentage 27885 1726882562.52965: checking to see if all hosts have failed and the running result is not ok 27885 1726882562.52965: done checking to see if all hosts have failed 27885 1726882562.52966: getting the remaining hosts for this loop 27885 1726882562.52968: done getting the remaining hosts for this loop 27885 1726882562.52972: getting the next task for host managed_node2 27885 1726882562.52980: done getting next task for host managed_node2 27885 1726882562.52983: ^ task is: TASK: Assert interface0 is absent 27885 1726882562.52987: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882562.52991: getting variables 27885 1726882562.52995: in VariableManager get_vars() 27885 1726882562.53035: Calling all_inventory to load vars for managed_node2 27885 1726882562.53038: Calling groups_inventory to load vars for managed_node2 27885 1726882562.53040: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882562.53051: Calling all_plugins_play to load vars for managed_node2 27885 1726882562.53053: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882562.53056: Calling groups_plugins_play to load vars for managed_node2 27885 1726882562.54322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882562.55189: done with get_vars() 27885 1726882562.55208: done getting variables TASK [Assert interface0 is absent] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:160 Friday 20 September 2024 21:36:02 -0400 (0:00:00.418) 0:00:35.195 ****** 27885 1726882562.55274: entering _queue_task() for managed_node2/include_tasks 27885 1726882562.55500: worker is 1 (out of 1 available) 27885 1726882562.55516: exiting _queue_task() for managed_node2/include_tasks 27885 1726882562.55529: done queuing things up, now waiting for results queue to drain 27885 1726882562.55531: waiting for pending results... 27885 1726882562.55708: running TaskExecutor() for managed_node2/TASK: Assert interface0 is absent 27885 1726882562.55779: in run() - task 12673a56-9f93-3fa5-01be-0000000000b9 27885 1726882562.55796: variable 'ansible_search_path' from source: unknown 27885 1726882562.55824: calling self._execute() 27885 1726882562.55915: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882562.55919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882562.55928: variable 'omit' from source: magic vars 27885 1726882562.56398: variable 'ansible_distribution_major_version' from source: facts 27885 1726882562.56402: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882562.56405: _execute() done 27885 1726882562.56407: dumping result to json 27885 1726882562.56410: done dumping result, returning 27885 1726882562.56412: done running TaskExecutor() for managed_node2/TASK: Assert interface0 is absent [12673a56-9f93-3fa5-01be-0000000000b9] 27885 1726882562.56414: sending task result for task 12673a56-9f93-3fa5-01be-0000000000b9 27885 1726882562.56480: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000b9 27885 1726882562.56483: WORKER PROCESS EXITING 27885 1726882562.56513: no more pending results, returning what we have 27885 1726882562.56519: in VariableManager get_vars() 27885 1726882562.56564: Calling all_inventory to load vars for managed_node2 27885 1726882562.56567: Calling groups_inventory to load vars for managed_node2 27885 1726882562.56569: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882562.56583: Calling all_plugins_play to load vars for managed_node2 27885 1726882562.56586: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882562.56588: Calling groups_plugins_play to load vars for managed_node2 27885 1726882562.58098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882562.59754: done with get_vars() 27885 1726882562.59775: variable 
'ansible_search_path' from source: unknown 27885 1726882562.59792: we have included files to process 27885 1726882562.59795: generating all_blocks data 27885 1726882562.59797: done generating all_blocks data 27885 1726882562.59801: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 27885 1726882562.59802: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 27885 1726882562.59804: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 27885 1726882562.59917: in VariableManager get_vars() 27885 1726882562.59945: done with get_vars() 27885 1726882562.60069: done processing included file 27885 1726882562.60071: iterating over new_blocks loaded from include file 27885 1726882562.60073: in VariableManager get_vars() 27885 1726882562.60095: done with get_vars() 27885 1726882562.60097: filtering new block on tags 27885 1726882562.60128: done filtering new block on tags 27885 1726882562.60131: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 27885 1726882562.60136: extending task lists for all hosts with included blocks 27885 1726882562.61694: done extending task lists 27885 1726882562.61696: done processing included files 27885 1726882562.61697: results queue empty 27885 1726882562.61697: checking for any_errors_fatal 27885 1726882562.61701: done checking for any_errors_fatal 27885 1726882562.61702: checking for max_fail_percentage 27885 1726882562.61703: done checking for max_fail_percentage 27885 1726882562.61704: checking to see if all hosts have failed and the running result is not ok 27885 1726882562.61705: done checking to see if all hosts have failed 27885 1726882562.61705: getting the remaining hosts for this loop 27885 1726882562.61706: done getting the remaining hosts for this loop 27885 1726882562.61709: getting the next task for host managed_node2 27885 1726882562.61713: done getting next task for host managed_node2 27885 1726882562.61715: ^ task is: TASK: Include the task 'get_interface_stat.yml' 27885 1726882562.61718: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882562.61721: getting variables 27885 1726882562.61722: in VariableManager get_vars() 27885 1726882562.61736: Calling all_inventory to load vars for managed_node2 27885 1726882562.61739: Calling groups_inventory to load vars for managed_node2 27885 1726882562.61740: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882562.61747: Calling all_plugins_play to load vars for managed_node2 27885 1726882562.61749: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882562.61752: Calling groups_plugins_play to load vars for managed_node2 27885 1726882562.63116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882562.64764: done with get_vars() 27885 1726882562.64797: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:36:02 -0400 (0:00:00.096) 0:00:35.291 ****** 27885 1726882562.64891: entering _queue_task() for managed_node2/include_tasks 27885 1726882562.65402: worker is 1 (out of 1 available) 27885 1726882562.65412: exiting _queue_task() for managed_node2/include_tasks 27885 1726882562.65425: done queuing things up, now waiting for results queue to drain 27885 1726882562.65426: waiting for pending results... 27885 1726882562.65667: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 27885 1726882562.65801: in run() - task 12673a56-9f93-3fa5-01be-000000000990 27885 1726882562.65870: variable 'ansible_search_path' from source: unknown 27885 1726882562.65873: variable 'ansible_search_path' from source: unknown 27885 1726882562.65876: calling self._execute() 27885 1726882562.65967: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882562.65980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882562.66003: variable 'omit' from source: magic vars 27885 1726882562.66422: variable 'ansible_distribution_major_version' from source: facts 27885 1726882562.66449: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882562.66498: _execute() done 27885 1726882562.66502: dumping result to json 27885 1726882562.66505: done dumping result, returning 27885 1726882562.66508: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-3fa5-01be-000000000990] 27885 1726882562.66511: sending task result for task 12673a56-9f93-3fa5-01be-000000000990 27885 1726882562.66718: done sending task result for task 12673a56-9f93-3fa5-01be-000000000990 27885 1726882562.66722: WORKER PROCESS EXITING 27885 1726882562.66753: no more pending results, returning what we have 27885 1726882562.66759: in VariableManager get_vars() 27885 1726882562.66928: Calling all_inventory to load vars for managed_node2 27885 1726882562.66931: Calling groups_inventory to load vars for managed_node2 27885 1726882562.66934: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882562.66946: Calling all_plugins_play to load vars for managed_node2 27885 1726882562.66950: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882562.66953: Calling groups_plugins_play to load vars for managed_node2 27885 1726882562.68479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 27885 1726882562.70150: done with get_vars() 27885 1726882562.70173: variable 'ansible_search_path' from source: unknown 27885 1726882562.70175: variable 'ansible_search_path' from source: unknown 27885 1726882562.70218: we have included files to process 27885 1726882562.70219: generating all_blocks data 27885 1726882562.70221: done generating all_blocks data 27885 1726882562.70222: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27885 1726882562.70223: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27885 1726882562.70225: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27885 1726882562.70435: done processing included file 27885 1726882562.70437: iterating over new_blocks loaded from include file 27885 1726882562.70439: in VariableManager get_vars() 27885 1726882562.70464: done with get_vars() 27885 1726882562.70467: filtering new block on tags 27885 1726882562.70499: done filtering new block on tags 27885 1726882562.70501: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 27885 1726882562.70507: extending task lists for all hosts with included blocks 27885 1726882562.70647: done extending task lists 27885 1726882562.70648: done processing included files 27885 1726882562.70649: results queue empty 27885 1726882562.70650: checking for any_errors_fatal 27885 1726882562.70653: done checking for any_errors_fatal 27885 1726882562.70654: checking for max_fail_percentage 27885 1726882562.70655: done checking for max_fail_percentage 27885 1726882562.70655: checking to see if all hosts have failed and the running result is not ok 27885 1726882562.70656: done checking to see if all hosts have failed 27885 1726882562.70657: getting the remaining hosts for this loop 27885 1726882562.70658: done getting the remaining hosts for this loop 27885 1726882562.70660: getting the next task for host managed_node2 27885 1726882562.70670: done getting next task for host managed_node2 27885 1726882562.70672: ^ task is: TASK: Get stat for interface {{ interface }} 27885 1726882562.70676: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 27885 1726882562.70679: getting variables 27885 1726882562.70680: in VariableManager get_vars() 27885 1726882562.70698: Calling all_inventory to load vars for managed_node2 27885 1726882562.70701: Calling groups_inventory to load vars for managed_node2 27885 1726882562.70703: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882562.70708: Calling all_plugins_play to load vars for managed_node2 27885 1726882562.70711: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882562.70713: Calling groups_plugins_play to load vars for managed_node2 27885 1726882562.72018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882562.73657: done with get_vars() 27885 1726882562.73683: done getting variables 27885 1726882562.73859: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:36:02 -0400 (0:00:00.090) 0:00:35.381 ****** 27885 1726882562.73896: entering _queue_task() for managed_node2/stat 27885 1726882562.74418: worker is 1 (out of 1 available) 27885 1726882562.74428: exiting _queue_task() for managed_node2/stat 27885 1726882562.74439: done queuing things up, now waiting for results queue to drain 27885 1726882562.74440: waiting for pending results... 27885 1726882562.74597: running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 27885 1726882562.74778: in run() - task 12673a56-9f93-3fa5-01be-000000000a4d 27885 1726882562.74782: variable 'ansible_search_path' from source: unknown 27885 1726882562.74788: variable 'ansible_search_path' from source: unknown 27885 1726882562.74809: calling self._execute() 27885 1726882562.74912: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882562.74925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882562.74939: variable 'omit' from source: magic vars 27885 1726882562.75322: variable 'ansible_distribution_major_version' from source: facts 27885 1726882562.75498: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882562.75502: variable 'omit' from source: magic vars 27885 1726882562.75505: variable 'omit' from source: magic vars 27885 1726882562.75507: variable 'interface' from source: set_fact 27885 1726882562.75526: variable 'omit' from source: magic vars 27885 1726882562.75569: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882562.75634: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882562.75659: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882562.75679: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882562.75701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882562.75741: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882562.75750: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882562.75757: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882562.75865: Set connection var ansible_pipelining to False 27885 1726882562.75875: Set connection var ansible_connection to ssh 27885 1726882562.75884: Set connection var ansible_timeout to 10 27885 1726882562.75895: Set connection var ansible_shell_type to sh 27885 1726882562.75905: Set connection var ansible_shell_executable to /bin/sh 27885 1726882562.75925: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882562.75972: variable 'ansible_shell_executable' from source: unknown 27885 1726882562.76067: variable 'ansible_connection' from source: unknown 27885 1726882562.76071: variable 'ansible_module_compression' from source: unknown 27885 1726882562.76073: variable 'ansible_shell_type' from source: unknown 27885 1726882562.76075: variable 'ansible_shell_executable' from source: unknown 27885 1726882562.76077: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882562.76079: variable 'ansible_pipelining' from source: unknown 27885 1726882562.76081: variable 'ansible_timeout' from source: unknown 27885 1726882562.76083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882562.76257: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882562.76282: variable 'omit' from source: magic vars 27885 1726882562.76299: starting attempt loop 27885 1726882562.76307: running the handler 27885 1726882562.76395: _low_level_execute_command(): starting 27885 1726882562.76399: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882562.77079: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882562.77120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882562.77133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882562.77222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882562.78929: stdout chunk (state=3): >>>/root <<< 27885 1726882562.79084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882562.79090: stdout chunk (state=3): >>><<< 27885 1726882562.79094: stderr chunk (state=3): >>><<< 27885 1726882562.79207: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882562.79210: _low_level_execute_command(): starting 27885 1726882562.79214: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882562.7911723-29474-126019871215132 `" && echo ansible-tmp-1726882562.7911723-29474-126019871215132="` echo /root/.ansible/tmp/ansible-tmp-1726882562.7911723-29474-126019871215132 `" ) && sleep 0' 27885 1726882562.79745: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882562.79749: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882562.79752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882562.79757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882562.79777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882562.79845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882562.79894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882562.79950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882562.81834: stdout chunk (state=3): >>>ansible-tmp-1726882562.7911723-29474-126019871215132=/root/.ansible/tmp/ansible-tmp-1726882562.7911723-29474-126019871215132 <<< 27885 1726882562.81944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882562.81965: stderr chunk (state=3): >>><<< 27885 1726882562.81968: stdout chunk (state=3): >>><<< 27885 
1726882562.81982: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882562.7911723-29474-126019871215132=/root/.ansible/tmp/ansible-tmp-1726882562.7911723-29474-126019871215132 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882562.82020: variable 'ansible_module_compression' from source: unknown 27885 1726882562.82063: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 27885 1726882562.82095: variable 'ansible_facts' from source: unknown 27885 1726882562.82157: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882562.7911723-29474-126019871215132/AnsiballZ_stat.py 27885 1726882562.82384: Sending initial data 27885 1726882562.82396: Sent initial data (153 bytes) 27885 1726882562.82904: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882562.82907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882562.82909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882562.82911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882562.82949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882562.82966: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882562.83044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882562.84570: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 27885 1726882562.84580: stderr chunk (state=3): 
>>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 27885 1726882562.84584: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 27885 1726882562.84591: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882562.84665: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882562.84726: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp00d_r75i /root/.ansible/tmp/ansible-tmp-1726882562.7911723-29474-126019871215132/AnsiballZ_stat.py <<< 27885 1726882562.84729: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882562.7911723-29474-126019871215132/AnsiballZ_stat.py" <<< 27885 1726882562.84786: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp00d_r75i" to remote "/root/.ansible/tmp/ansible-tmp-1726882562.7911723-29474-126019871215132/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882562.7911723-29474-126019871215132/AnsiballZ_stat.py" <<< 27885 1726882562.85397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882562.85439: stderr chunk (state=3): >>><<< 27885 1726882562.85442: stdout chunk (state=3): >>><<< 27885 1726882562.85458: done transferring module to remote 27885 1726882562.85467: _low_level_execute_command(): starting 27885 1726882562.85471: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882562.7911723-29474-126019871215132/ /root/.ansible/tmp/ansible-tmp-1726882562.7911723-29474-126019871215132/AnsiballZ_stat.py && sleep 0' 27885 1726882562.85887: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882562.85906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882562.85910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882562.85921: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882562.85977: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882562.85980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882562.86043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882562.87761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882562.87790: stderr chunk (state=3): >>><<< 27885 1726882562.87795: stdout chunk (state=3): >>><<< 27885 1726882562.87806: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882562.87809: _low_level_execute_command(): starting 27885 1726882562.87814: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882562.7911723-29474-126019871215132/AnsiballZ_stat.py && sleep 0' 27885 1726882562.88246: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882562.88249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882562.88252: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882562.88254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882562.88299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882562.88303: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882562.88319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882562.88383: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 27885 1726882563.03466: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 27885 1726882563.04484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882563.04491: stdout chunk (state=3): >>><<< 27885 1726882563.04495: stderr chunk (state=3): >>><<< 27885 1726882563.04515: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
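The JSON returned above (and the module_args echoed in the next message) comes from a plain stat task in get_interface_stat.yml. The following is only a hedged reconstruction from the logged arguments (path, get_attributes, get_checksum, get_mime); the register name is inferred from the later 'not interface_stat.stat.exists' check, and the file itself may differ in detail:

- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: /sys/class/net/{{ interface }}   # logged as /sys/class/net/ethtest0
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat   # inferred; the result is later read as interface_stat.stat.exists

The surrounding _low_level_execute_command() calls are the usual remote execution path visible throughout this trace: create a temp dir under ~/.ansible/tmp, SFTP the AnsiballZ_stat.py payload, chmod it, run it with /usr/bin/python3.12, then remove the temp dir.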
27885 1726882563.04549: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882562.7911723-29474-126019871215132/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882563.04595: _low_level_execute_command(): starting 27885 1726882563.04599: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882562.7911723-29474-126019871215132/ > /dev/null 2>&1 && sleep 0' 27885 1726882563.05212: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882563.05227: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882563.05240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882563.05267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882563.05370: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882563.05383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882563.05404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882563.05423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882563.05522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882563.07391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882563.07397: stdout chunk (state=3): >>><<< 27885 1726882563.07400: stderr chunk (state=3): >>><<< 27885 1726882563.07417: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882563.07423: handler run complete 27885 1726882563.07602: attempt loop complete, returning result 27885 1726882563.07605: _execute() done 27885 1726882563.07607: dumping result to json 27885 1726882563.07609: done dumping result, returning 27885 1726882563.07610: done running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 [12673a56-9f93-3fa5-01be-000000000a4d] 27885 1726882563.07612: sending task result for task 12673a56-9f93-3fa5-01be-000000000a4d 27885 1726882563.07677: done sending task result for task 12673a56-9f93-3fa5-01be-000000000a4d 27885 1726882563.07679: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 27885 1726882563.07767: no more pending results, returning what we have 27885 1726882563.07771: results queue empty 27885 1726882563.07772: checking for any_errors_fatal 27885 1726882563.07774: done checking for any_errors_fatal 27885 1726882563.07774: checking for max_fail_percentage 27885 1726882563.07776: done checking for max_fail_percentage 27885 1726882563.07777: checking to see if all hosts have failed and the running result is not ok 27885 1726882563.07778: done checking to see if all hosts have failed 27885 1726882563.07778: getting the remaining hosts for this loop 27885 1726882563.07780: done getting the remaining hosts for this loop 27885 1726882563.07784: getting the next task for host managed_node2 27885 1726882563.07794: done getting next task for host managed_node2 27885 1726882563.07798: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 27885 1726882563.07802: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882563.07808: getting variables 27885 1726882563.07810: in VariableManager get_vars() 27885 1726882563.07941: Calling all_inventory to load vars for managed_node2 27885 1726882563.07944: Calling groups_inventory to load vars for managed_node2 27885 1726882563.07947: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882563.07956: Calling all_plugins_play to load vars for managed_node2 27885 1726882563.07959: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882563.07962: Calling groups_plugins_play to load vars for managed_node2 27885 1726882563.09411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882563.11013: done with get_vars() 27885 1726882563.11038: done getting variables 27885 1726882563.11111: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882563.11243: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:36:03 -0400 (0:00:00.373) 0:00:35.755 ****** 27885 1726882563.11275: entering _queue_task() for managed_node2/assert 27885 1726882563.11822: worker is 1 (out of 1 available) 27885 1726882563.11832: exiting _queue_task() for managed_node2/assert 27885 1726882563.11843: done queuing things up, now waiting for results queue to drain 27885 1726882563.11844: waiting for pending results... 
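The assert task queued here (assert_device_absent.yml:5) checks the stat result gathered above; the conditional it evaluates, 'not interface_stat.stat.exists', is shown a little further down in the trace. A minimal sketch of the task, reconstructed from the logged task name and conditional rather than quoted from the file:

- name: Assert that the interface is absent - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - not interface_stat.stat.exists   # conditional as evaluated in the log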
27885 1726882563.11981: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'ethtest0' 27885 1726882563.12157: in run() - task 12673a56-9f93-3fa5-01be-000000000991 27885 1726882563.12161: variable 'ansible_search_path' from source: unknown 27885 1726882563.12165: variable 'ansible_search_path' from source: unknown 27885 1726882563.12502: calling self._execute() 27885 1726882563.12506: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882563.12509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882563.12513: variable 'omit' from source: magic vars 27885 1726882563.12627: variable 'ansible_distribution_major_version' from source: facts 27885 1726882563.12644: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882563.12798: variable 'omit' from source: magic vars 27885 1726882563.12801: variable 'omit' from source: magic vars 27885 1726882563.12803: variable 'interface' from source: set_fact 27885 1726882563.12806: variable 'omit' from source: magic vars 27885 1726882563.12840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882563.12880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882563.12901: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882563.12918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882563.12929: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882563.12967: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882563.12971: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882563.12973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882563.13078: Set connection var ansible_pipelining to False 27885 1726882563.13081: Set connection var ansible_connection to ssh 27885 1726882563.13090: Set connection var ansible_timeout to 10 27885 1726882563.13095: Set connection var ansible_shell_type to sh 27885 1726882563.13097: Set connection var ansible_shell_executable to /bin/sh 27885 1726882563.13103: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882563.13126: variable 'ansible_shell_executable' from source: unknown 27885 1726882563.13129: variable 'ansible_connection' from source: unknown 27885 1726882563.13132: variable 'ansible_module_compression' from source: unknown 27885 1726882563.13134: variable 'ansible_shell_type' from source: unknown 27885 1726882563.13136: variable 'ansible_shell_executable' from source: unknown 27885 1726882563.13138: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882563.13142: variable 'ansible_pipelining' from source: unknown 27885 1726882563.13145: variable 'ansible_timeout' from source: unknown 27885 1726882563.13149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882563.13296: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 27885 1726882563.13307: variable 'omit' from source: magic vars 27885 1726882563.13312: starting attempt loop 27885 1726882563.13315: running the handler 27885 1726882563.13463: variable 'interface_stat' from source: set_fact 27885 1726882563.13474: Evaluated conditional (not interface_stat.stat.exists): True 27885 1726882563.13479: handler run complete 27885 1726882563.13503: attempt loop complete, returning result 27885 1726882563.13506: _execute() done 27885 1726882563.13508: dumping result to json 27885 1726882563.13511: done dumping result, returning 27885 1726882563.13518: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'ethtest0' [12673a56-9f93-3fa5-01be-000000000991] 27885 1726882563.13523: sending task result for task 12673a56-9f93-3fa5-01be-000000000991 27885 1726882563.13613: done sending task result for task 12673a56-9f93-3fa5-01be-000000000991 27885 1726882563.13617: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 27885 1726882563.13667: no more pending results, returning what we have 27885 1726882563.13671: results queue empty 27885 1726882563.13672: checking for any_errors_fatal 27885 1726882563.13683: done checking for any_errors_fatal 27885 1726882563.13683: checking for max_fail_percentage 27885 1726882563.13685: done checking for max_fail_percentage 27885 1726882563.13686: checking to see if all hosts have failed and the running result is not ok 27885 1726882563.13687: done checking to see if all hosts have failed 27885 1726882563.13688: getting the remaining hosts for this loop 27885 1726882563.13689: done getting the remaining hosts for this loop 27885 1726882563.13797: getting the next task for host managed_node2 27885 1726882563.13808: done getting next task for host managed_node2 27885 1726882563.13812: ^ task is: TASK: Assert interface0 profile and interface1 profile are absent 27885 1726882563.13816: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882563.13822: getting variables 27885 1726882563.13823: in VariableManager get_vars() 27885 1726882563.13865: Calling all_inventory to load vars for managed_node2 27885 1726882563.13868: Calling groups_inventory to load vars for managed_node2 27885 1726882563.13871: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882563.13881: Calling all_plugins_play to load vars for managed_node2 27885 1726882563.13885: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882563.13887: Calling groups_plugins_play to load vars for managed_node2 27885 1726882563.15530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882563.17156: done with get_vars() 27885 1726882563.17187: done getting variables TASK [Assert interface0 profile and interface1 profile are absent] ************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:162 Friday 20 September 2024 21:36:03 -0400 (0:00:00.060) 0:00:35.815 ****** 27885 1726882563.17291: entering _queue_task() for managed_node2/include_tasks 27885 1726882563.17665: worker is 1 (out of 1 available) 27885 1726882563.17678: exiting _queue_task() for managed_node2/include_tasks 27885 1726882563.17798: done queuing things up, now waiting for results queue to drain 27885 1726882563.17803: waiting for pending results... 27885 1726882563.18112: running TaskExecutor() for managed_node2/TASK: Assert interface0 profile and interface1 profile are absent 27885 1726882563.18117: in run() - task 12673a56-9f93-3fa5-01be-0000000000ba 27885 1726882563.18122: variable 'ansible_search_path' from source: unknown 27885 1726882563.18170: variable 'interface0' from source: play vars 27885 1726882563.18385: variable 'interface0' from source: play vars 27885 1726882563.18402: variable 'interface1' from source: play vars 27885 1726882563.18474: variable 'interface1' from source: play vars 27885 1726882563.18489: variable 'omit' from source: magic vars 27885 1726882563.18698: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882563.18703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882563.18706: variable 'omit' from source: magic vars 27885 1726882563.18901: variable 'ansible_distribution_major_version' from source: facts 27885 1726882563.18912: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882563.18945: variable 'item' from source: unknown 27885 1726882563.19013: variable 'item' from source: unknown 27885 1726882563.19163: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882563.19168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882563.19171: variable 'omit' from source: magic vars 27885 1726882563.19499: variable 'ansible_distribution_major_version' from source: facts 27885 1726882563.19506: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882563.19508: variable 'item' from source: unknown 27885 1726882563.19510: variable 'item' from source: unknown 27885 1726882563.19556: dumping result to json 27885 1726882563.19559: done dumping result, returning 27885 1726882563.19560: done running TaskExecutor() for managed_node2/TASK: Assert interface0 profile and interface1 profile are absent [12673a56-9f93-3fa5-01be-0000000000ba] 27885 1726882563.19562: sending task result for task 
12673a56-9f93-3fa5-01be-0000000000ba 27885 1726882563.19600: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000ba 27885 1726882563.19603: WORKER PROCESS EXITING 27885 1726882563.19640: no more pending results, returning what we have 27885 1726882563.19645: in VariableManager get_vars() 27885 1726882563.19742: Calling all_inventory to load vars for managed_node2 27885 1726882563.19745: Calling groups_inventory to load vars for managed_node2 27885 1726882563.19747: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882563.19800: Calling all_plugins_play to load vars for managed_node2 27885 1726882563.19803: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882563.19806: Calling groups_plugins_play to load vars for managed_node2 27885 1726882563.21273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882563.26794: done with get_vars() 27885 1726882563.26811: variable 'ansible_search_path' from source: unknown 27885 1726882563.26825: variable 'ansible_search_path' from source: unknown 27885 1726882563.26829: we have included files to process 27885 1726882563.26830: generating all_blocks data 27885 1726882563.26831: done generating all_blocks data 27885 1726882563.26832: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 27885 1726882563.26833: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 27885 1726882563.26834: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 27885 1726882563.26930: in VariableManager get_vars() 27885 1726882563.26946: done with get_vars() 27885 1726882563.27021: done processing included file 27885 1726882563.27022: iterating over new_blocks loaded from include file 27885 1726882563.27023: in VariableManager get_vars() 27885 1726882563.27034: done with get_vars() 27885 1726882563.27035: filtering new block on tags 27885 1726882563.27055: done filtering new block on tags 27885 1726882563.27057: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node2 => (item=ethtest0) 27885 1726882563.27061: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 27885 1726882563.27062: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 27885 1726882563.27064: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 27885 1726882563.27116: in VariableManager get_vars() 27885 1726882563.27130: done with get_vars() 27885 1726882563.27189: done processing included file 27885 1726882563.27191: iterating over new_blocks loaded from include file 27885 1726882563.27191: in VariableManager get_vars() 27885 1726882563.27204: done with get_vars() 27885 1726882563.27205: filtering new block on tags 27885 1726882563.27223: done filtering new block on tags 27885 1726882563.27224: done iterating over new_blocks loaded from include file included: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node2 => (item=ethtest1) 27885 1726882563.27227: extending task lists for all hosts with included blocks 27885 1726882563.28629: done extending task lists 27885 1726882563.28631: done processing included files 27885 1726882563.28632: results queue empty 27885 1726882563.28633: checking for any_errors_fatal 27885 1726882563.28636: done checking for any_errors_fatal 27885 1726882563.28637: checking for max_fail_percentage 27885 1726882563.28638: done checking for max_fail_percentage 27885 1726882563.28639: checking to see if all hosts have failed and the running result is not ok 27885 1726882563.28639: done checking to see if all hosts have failed 27885 1726882563.28640: getting the remaining hosts for this loop 27885 1726882563.28641: done getting the remaining hosts for this loop 27885 1726882563.28644: getting the next task for host managed_node2 27885 1726882563.28647: done getting next task for host managed_node2 27885 1726882563.28650: ^ task is: TASK: Include the task 'get_profile_stat.yml' 27885 1726882563.28653: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882563.28655: getting variables 27885 1726882563.28656: in VariableManager get_vars() 27885 1726882563.28672: Calling all_inventory to load vars for managed_node2 27885 1726882563.28674: Calling groups_inventory to load vars for managed_node2 27885 1726882563.28681: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882563.28689: Calling all_plugins_play to load vars for managed_node2 27885 1726882563.28692: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882563.28696: Calling groups_plugins_play to load vars for managed_node2 27885 1726882563.29574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882563.30518: done with get_vars() 27885 1726882563.30538: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:36:03 -0400 (0:00:00.133) 0:00:35.948 ****** 27885 1726882563.30617: entering _queue_task() for managed_node2/include_tasks 27885 1726882563.31400: worker is 1 (out of 1 available) 27885 1726882563.31412: exiting _queue_task() for managed_node2/include_tasks 27885 1726882563.31425: done queuing things up, now waiting for results queue to drain 27885 1726882563.31426: waiting for pending results... 
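The include that just completed fans assert_profile_absent.yml out over both test interfaces: the log shows item=ethtest0 and item=ethtest1, with interface0/interface1 coming from play vars and 'profile' arriving later as an include param. A plausible shape for the task at tests_route_device.yml:162; the relative path and the exact loop/vars wiring are assumptions inferred from those logged sources, not the verbatim file:

- name: Assert interface0 profile and interface1 profile are absent
  ansible.builtin.include_tasks: tasks/assert_profile_absent.yml
  vars:
    profile: "{{ item }}"   # surfaces in the log as "variable 'profile' from source: include params"
  loop:
    - "{{ interface0 }}"    # ethtest0
    - "{{ interface1 }}"    # ethtest1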
27885 1726882563.31821: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 27885 1726882563.31854: in run() - task 12673a56-9f93-3fa5-01be-000000000a6c 27885 1726882563.31868: variable 'ansible_search_path' from source: unknown 27885 1726882563.31872: variable 'ansible_search_path' from source: unknown 27885 1726882563.31913: calling self._execute() 27885 1726882563.32019: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882563.32023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882563.32073: variable 'omit' from source: magic vars 27885 1726882563.32415: variable 'ansible_distribution_major_version' from source: facts 27885 1726882563.32489: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882563.32495: _execute() done 27885 1726882563.32498: dumping result to json 27885 1726882563.32500: done dumping result, returning 27885 1726882563.32501: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-3fa5-01be-000000000a6c] 27885 1726882563.32503: sending task result for task 12673a56-9f93-3fa5-01be-000000000a6c 27885 1726882563.32561: done sending task result for task 12673a56-9f93-3fa5-01be-000000000a6c 27885 1726882563.32563: WORKER PROCESS EXITING 27885 1726882563.32595: no more pending results, returning what we have 27885 1726882563.32601: in VariableManager get_vars() 27885 1726882563.32645: Calling all_inventory to load vars for managed_node2 27885 1726882563.32648: Calling groups_inventory to load vars for managed_node2 27885 1726882563.32650: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882563.32661: Calling all_plugins_play to load vars for managed_node2 27885 1726882563.32664: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882563.32666: Calling groups_plugins_play to load vars for managed_node2 27885 1726882563.34432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882563.36290: done with get_vars() 27885 1726882563.36313: variable 'ansible_search_path' from source: unknown 27885 1726882563.36314: variable 'ansible_search_path' from source: unknown 27885 1726882563.36351: we have included files to process 27885 1726882563.36358: generating all_blocks data 27885 1726882563.36360: done generating all_blocks data 27885 1726882563.36361: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 27885 1726882563.36362: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 27885 1726882563.36365: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 27885 1726882563.37502: done processing included file 27885 1726882563.37504: iterating over new_blocks loaded from include file 27885 1726882563.37505: in VariableManager get_vars() 27885 1726882563.37529: done with get_vars() 27885 1726882563.37531: filtering new block on tags 27885 1726882563.37658: done filtering new block on tags 27885 1726882563.37661: in VariableManager get_vars() 27885 1726882563.37681: done with get_vars() 27885 1726882563.37682: filtering new block on tags 27885 1726882563.37736: done filtering new block on tags 27885 1726882563.37738: done iterating over 
new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 27885 1726882563.37743: extending task lists for all hosts with included blocks 27885 1726882563.37866: done extending task lists 27885 1726882563.37867: done processing included files 27885 1726882563.37868: results queue empty 27885 1726882563.37869: checking for any_errors_fatal 27885 1726882563.37877: done checking for any_errors_fatal 27885 1726882563.37878: checking for max_fail_percentage 27885 1726882563.37879: done checking for max_fail_percentage 27885 1726882563.37880: checking to see if all hosts have failed and the running result is not ok 27885 1726882563.37881: done checking to see if all hosts have failed 27885 1726882563.37882: getting the remaining hosts for this loop 27885 1726882563.37883: done getting the remaining hosts for this loop 27885 1726882563.37885: getting the next task for host managed_node2 27885 1726882563.37889: done getting next task for host managed_node2 27885 1726882563.37891: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 27885 1726882563.37896: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882563.37899: getting variables 27885 1726882563.37899: in VariableManager get_vars() 27885 1726882563.37912: Calling all_inventory to load vars for managed_node2 27885 1726882563.37914: Calling groups_inventory to load vars for managed_node2 27885 1726882563.37916: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882563.37921: Calling all_plugins_play to load vars for managed_node2 27885 1726882563.37924: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882563.37926: Calling groups_plugins_play to load vars for managed_node2 27885 1726882563.39095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882563.40752: done with get_vars() 27885 1726882563.40774: done getting variables 27885 1726882563.40826: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:36:03 -0400 (0:00:00.102) 0:00:36.051 ****** 27885 1726882563.40862: entering _queue_task() for managed_node2/set_fact 27885 1726882563.41500: worker is 1 (out of 1 available) 27885 1726882563.41507: exiting _queue_task() for managed_node2/set_fact 27885 1726882563.41518: done queuing things up, now waiting for results queue to drain 27885 1726882563.41519: waiting for pending results... 
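The set_fact task being queued here (get_profile_stat.yml:3) only initializes the flags that later profile checks overwrite; judging from the ansible_facts echoed in its result just below, it amounts to something like:

- name: Initialize NM profile exist and ansible_managed comment flag
  ansible.builtin.set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false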
27885 1726882563.41649: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 27885 1726882563.41747: in run() - task 12673a56-9f93-3fa5-01be-000000000b3c 27885 1726882563.41751: variable 'ansible_search_path' from source: unknown 27885 1726882563.41754: variable 'ansible_search_path' from source: unknown 27885 1726882563.41758: calling self._execute() 27885 1726882563.41851: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882563.41867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882563.41879: variable 'omit' from source: magic vars 27885 1726882563.42259: variable 'ansible_distribution_major_version' from source: facts 27885 1726882563.42276: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882563.42297: variable 'omit' from source: magic vars 27885 1726882563.42400: variable 'omit' from source: magic vars 27885 1726882563.42403: variable 'omit' from source: magic vars 27885 1726882563.42438: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882563.42478: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882563.42510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882563.42531: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882563.42547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882563.42582: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882563.42591: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882563.42600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882563.42726: Set connection var ansible_pipelining to False 27885 1726882563.42729: Set connection var ansible_connection to ssh 27885 1726882563.42732: Set connection var ansible_timeout to 10 27885 1726882563.42798: Set connection var ansible_shell_type to sh 27885 1726882563.42801: Set connection var ansible_shell_executable to /bin/sh 27885 1726882563.42803: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882563.42805: variable 'ansible_shell_executable' from source: unknown 27885 1726882563.42807: variable 'ansible_connection' from source: unknown 27885 1726882563.42809: variable 'ansible_module_compression' from source: unknown 27885 1726882563.42811: variable 'ansible_shell_type' from source: unknown 27885 1726882563.42814: variable 'ansible_shell_executable' from source: unknown 27885 1726882563.42816: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882563.42818: variable 'ansible_pipelining' from source: unknown 27885 1726882563.42820: variable 'ansible_timeout' from source: unknown 27885 1726882563.42821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882563.42965: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882563.42981: variable 
'omit' from source: magic vars 27885 1726882563.42992: starting attempt loop 27885 1726882563.43001: running the handler 27885 1726882563.43017: handler run complete 27885 1726882563.43031: attempt loop complete, returning result 27885 1726882563.43037: _execute() done 27885 1726882563.43158: dumping result to json 27885 1726882563.43163: done dumping result, returning 27885 1726882563.43166: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-3fa5-01be-000000000b3c] 27885 1726882563.43168: sending task result for task 12673a56-9f93-3fa5-01be-000000000b3c 27885 1726882563.43240: done sending task result for task 12673a56-9f93-3fa5-01be-000000000b3c 27885 1726882563.43244: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 27885 1726882563.43305: no more pending results, returning what we have 27885 1726882563.43309: results queue empty 27885 1726882563.43310: checking for any_errors_fatal 27885 1726882563.43312: done checking for any_errors_fatal 27885 1726882563.43313: checking for max_fail_percentage 27885 1726882563.43315: done checking for max_fail_percentage 27885 1726882563.43316: checking to see if all hosts have failed and the running result is not ok 27885 1726882563.43317: done checking to see if all hosts have failed 27885 1726882563.43317: getting the remaining hosts for this loop 27885 1726882563.43320: done getting the remaining hosts for this loop 27885 1726882563.43324: getting the next task for host managed_node2 27885 1726882563.43333: done getting next task for host managed_node2 27885 1726882563.43336: ^ task is: TASK: Stat profile file 27885 1726882563.43343: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882563.43347: getting variables 27885 1726882563.43349: in VariableManager get_vars() 27885 1726882563.43391: Calling all_inventory to load vars for managed_node2 27885 1726882563.43396: Calling groups_inventory to load vars for managed_node2 27885 1726882563.43399: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882563.43410: Calling all_plugins_play to load vars for managed_node2 27885 1726882563.43413: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882563.43416: Calling groups_plugins_play to load vars for managed_node2 27885 1726882563.45061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882563.47760: done with get_vars() 27885 1726882563.47803: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:36:03 -0400 (0:00:00.070) 0:00:36.121 ****** 27885 1726882563.48001: entering _queue_task() for managed_node2/stat 27885 1726882563.48610: worker is 1 (out of 1 available) 27885 1726882563.48623: exiting _queue_task() for managed_node2/stat 27885 1726882563.48638: done queuing things up, now waiting for results queue to drain 27885 1726882563.48639: waiting for pending results... 27885 1726882563.49217: running TaskExecutor() for managed_node2/TASK: Stat profile file 27885 1726882563.49469: in run() - task 12673a56-9f93-3fa5-01be-000000000b3d 27885 1726882563.49475: variable 'ansible_search_path' from source: unknown 27885 1726882563.49478: variable 'ansible_search_path' from source: unknown 27885 1726882563.49578: calling self._execute() 27885 1726882563.49756: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882563.49769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882563.49784: variable 'omit' from source: magic vars 27885 1726882563.50663: variable 'ansible_distribution_major_version' from source: facts 27885 1726882563.50669: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882563.50681: variable 'omit' from source: magic vars 27885 1726882563.50795: variable 'omit' from source: magic vars 27885 1726882563.51072: variable 'profile' from source: include params 27885 1726882563.51199: variable 'item' from source: include params 27885 1726882563.51203: variable 'item' from source: include params 27885 1726882563.51319: variable 'omit' from source: magic vars 27885 1726882563.51343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882563.51422: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882563.51537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882563.51541: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882563.51645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882563.51649: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882563.51651: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882563.51653: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882563.51821: Set connection var ansible_pipelining to False 27885 1726882563.51828: Set connection var ansible_connection to ssh 27885 1726882563.51838: Set connection var ansible_timeout to 10 27885 1726882563.51845: Set connection var ansible_shell_type to sh 27885 1726882563.51907: Set connection var ansible_shell_executable to /bin/sh 27885 1726882563.51917: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882563.51950: variable 'ansible_shell_executable' from source: unknown 27885 1726882563.51959: variable 'ansible_connection' from source: unknown 27885 1726882563.51971: variable 'ansible_module_compression' from source: unknown 27885 1726882563.51978: variable 'ansible_shell_type' from source: unknown 27885 1726882563.51986: variable 'ansible_shell_executable' from source: unknown 27885 1726882563.52035: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882563.52038: variable 'ansible_pipelining' from source: unknown 27885 1726882563.52041: variable 'ansible_timeout' from source: unknown 27885 1726882563.52043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882563.52240: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882563.52265: variable 'omit' from source: magic vars 27885 1726882563.52277: starting attempt loop 27885 1726882563.52285: running the handler 27885 1726882563.52362: _low_level_execute_command(): starting 27885 1726882563.52365: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882563.53143: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882563.53198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882563.53218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882563.53245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882563.53341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882563.55174: stdout chunk (state=3): >>>/root <<< 27885 1726882563.55455: stdout chunk (state=3): >>><<< 27885 1726882563.55459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882563.55462: stderr chunk (state=3): >>><<< 27885 1726882563.55464: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882563.55467: _low_level_execute_command(): starting 27885 1726882563.55470: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882563.5534685-29506-87685202284992 `" && echo ansible-tmp-1726882563.5534685-29506-87685202284992="` echo /root/.ansible/tmp/ansible-tmp-1726882563.5534685-29506-87685202284992 `" ) && sleep 0' 27885 1726882563.56548: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882563.56573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882563.56590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882563.56611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882563.56708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882563.56774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882563.56830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882563.56924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882563.58807: stdout chunk (state=3): >>>ansible-tmp-1726882563.5534685-29506-87685202284992=/root/.ansible/tmp/ansible-tmp-1726882563.5534685-29506-87685202284992 <<< 27885 1726882563.59011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882563.59014: stdout chunk (state=3): 
>>><<< 27885 1726882563.59016: stderr chunk (state=3): >>><<< 27885 1726882563.59034: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882563.5534685-29506-87685202284992=/root/.ansible/tmp/ansible-tmp-1726882563.5534685-29506-87685202284992 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882563.59499: variable 'ansible_module_compression' from source: unknown 27885 1726882563.59502: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 27885 1726882563.59504: variable 'ansible_facts' from source: unknown 27885 1726882563.59957: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882563.5534685-29506-87685202284992/AnsiballZ_stat.py 27885 1726882563.60433: Sending initial data 27885 1726882563.60444: Sent initial data (152 bytes) 27885 1726882563.61481: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882563.61485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882563.61488: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882563.61490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882563.61799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882563.62043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882563.63427: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882563.63530: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882563.63588: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp_l98uril /root/.ansible/tmp/ansible-tmp-1726882563.5534685-29506-87685202284992/AnsiballZ_stat.py <<< 27885 1726882563.63602: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882563.5534685-29506-87685202284992/AnsiballZ_stat.py" <<< 27885 1726882563.63687: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp_l98uril" to remote "/root/.ansible/tmp/ansible-tmp-1726882563.5534685-29506-87685202284992/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882563.5534685-29506-87685202284992/AnsiballZ_stat.py" <<< 27885 1726882563.65058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882563.65138: stderr chunk (state=3): >>><<< 27885 1726882563.65141: stdout chunk (state=3): >>><<< 27885 1726882563.65148: done transferring module to remote 27885 1726882563.65161: _low_level_execute_command(): starting 27885 1726882563.65246: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882563.5534685-29506-87685202284992/ /root/.ansible/tmp/ansible-tmp-1726882563.5534685-29506-87685202284992/AnsiballZ_stat.py && sleep 0' 27885 1726882563.66322: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882563.66452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882563.66618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882563.66705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 
1726882563.68434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882563.68464: stderr chunk (state=3): >>><<< 27885 1726882563.68472: stdout chunk (state=3): >>><<< 27885 1726882563.68519: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882563.68531: _low_level_execute_command(): starting 27885 1726882563.68680: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882563.5534685-29506-87685202284992/AnsiballZ_stat.py && sleep 0' 27885 1726882563.69706: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882563.69739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882563.69754: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882563.69763: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882563.70130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882563.85019: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 27885 1726882563.86299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882563.86303: stdout chunk (state=3): >>><<< 27885 1726882563.86306: stderr chunk (state=3): >>><<< 27885 1726882563.86308: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 27885 1726882563.86311: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882563.5534685-29506-87685202284992/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882563.86314: _low_level_execute_command(): starting 27885 1726882563.86316: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882563.5534685-29506-87685202284992/ > /dev/null 2>&1 && sleep 0' 27885 1726882563.86975: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882563.87299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882563.87302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882563.87305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882563.87307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882563.87310: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882563.87311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 
1726882563.87313: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27885 1726882563.87315: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 27885 1726882563.87317: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27885 1726882563.87320: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882563.87322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882563.87324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882563.87326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882563.89121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882563.89153: stderr chunk (state=3): >>><<< 27885 1726882563.89156: stdout chunk (state=3): >>><<< 27885 1726882563.89173: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882563.89179: handler run complete 27885 1726882563.89209: attempt loop complete, returning result 27885 1726882563.89212: _execute() done 27885 1726882563.89216: dumping result to json 27885 1726882563.89224: done dumping result, returning 27885 1726882563.89235: done running TaskExecutor() for managed_node2/TASK: Stat profile file [12673a56-9f93-3fa5-01be-000000000b3d] 27885 1726882563.89240: sending task result for task 12673a56-9f93-3fa5-01be-000000000b3d ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 27885 1726882563.89410: no more pending results, returning what we have 27885 1726882563.89414: results queue empty 27885 1726882563.89415: checking for any_errors_fatal 27885 1726882563.89421: done checking for any_errors_fatal 27885 1726882563.89421: checking for max_fail_percentage 27885 1726882563.89423: done checking for max_fail_percentage 27885 1726882563.89424: checking to see if all hosts have failed and the running result is not ok 27885 
1726882563.89424: done checking to see if all hosts have failed 27885 1726882563.89425: getting the remaining hosts for this loop 27885 1726882563.89427: done getting the remaining hosts for this loop 27885 1726882563.89434: getting the next task for host managed_node2 27885 1726882563.89441: done getting next task for host managed_node2 27885 1726882563.89443: ^ task is: TASK: Set NM profile exist flag based on the profile files 27885 1726882563.89449: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882563.89453: getting variables 27885 1726882563.89455: in VariableManager get_vars() 27885 1726882563.89504: Calling all_inventory to load vars for managed_node2 27885 1726882563.89507: Calling groups_inventory to load vars for managed_node2 27885 1726882563.89509: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882563.89524: Calling all_plugins_play to load vars for managed_node2 27885 1726882563.89526: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882563.89529: Calling groups_plugins_play to load vars for managed_node2 27885 1726882563.90104: done sending task result for task 12673a56-9f93-3fa5-01be-000000000b3d 27885 1726882563.90108: WORKER PROCESS EXITING 27885 1726882563.90603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882563.92381: done with get_vars() 27885 1726882563.92412: done getting variables 27885 1726882563.92477: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:36:03 -0400 (0:00:00.446) 0:00:36.567 ****** 27885 1726882563.92514: entering _queue_task() for managed_node2/set_fact 27885 1726882563.92850: worker is 1 (out of 1 available) 27885 1726882563.92862: exiting _queue_task() for managed_node2/set_fact 27885 1726882563.92877: done queuing things up, now waiting for results queue to drain 27885 1726882563.92878: waiting for pending results... 
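The stat call traced above (get_profile_stat.yml:9) can be read back into a task of roughly the following shape. This is a sketch reconstructed from the logged module_args; the register name profile_stat is inferred from the profile_stat.stat.exists conditional evaluated in the next task and is not printed directly in this log.

# Sketch only: reconstructed from the logged stat module_args for managed_node2.
# 'profile' expands to "ethtest0" on this run (see the logged ifcfg path).
- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat   # inferred from the later 'profile_stat.stat.exists' check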
27885 1726882563.93138: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 27885 1726882563.93227: in run() - task 12673a56-9f93-3fa5-01be-000000000b3e 27885 1726882563.93239: variable 'ansible_search_path' from source: unknown 27885 1726882563.93242: variable 'ansible_search_path' from source: unknown 27885 1726882563.93270: calling self._execute() 27885 1726882563.93359: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882563.93364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882563.93372: variable 'omit' from source: magic vars 27885 1726882563.93657: variable 'ansible_distribution_major_version' from source: facts 27885 1726882563.93668: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882563.93756: variable 'profile_stat' from source: set_fact 27885 1726882563.93764: Evaluated conditional (profile_stat.stat.exists): False 27885 1726882563.93768: when evaluation is False, skipping this task 27885 1726882563.93771: _execute() done 27885 1726882563.93773: dumping result to json 27885 1726882563.93776: done dumping result, returning 27885 1726882563.93783: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-3fa5-01be-000000000b3e] 27885 1726882563.93788: sending task result for task 12673a56-9f93-3fa5-01be-000000000b3e 27885 1726882563.93872: done sending task result for task 12673a56-9f93-3fa5-01be-000000000b3e 27885 1726882563.93875: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27885 1726882563.93922: no more pending results, returning what we have 27885 1726882563.93926: results queue empty 27885 1726882563.93927: checking for any_errors_fatal 27885 1726882563.93934: done checking for any_errors_fatal 27885 1726882563.93934: checking for max_fail_percentage 27885 1726882563.93936: done checking for max_fail_percentage 27885 1726882563.93937: checking to see if all hosts have failed and the running result is not ok 27885 1726882563.93938: done checking to see if all hosts have failed 27885 1726882563.93938: getting the remaining hosts for this loop 27885 1726882563.93940: done getting the remaining hosts for this loop 27885 1726882563.93943: getting the next task for host managed_node2 27885 1726882563.93951: done getting next task for host managed_node2 27885 1726882563.93954: ^ task is: TASK: Get NM profile info 27885 1726882563.93960: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882563.93964: getting variables 27885 1726882563.93966: in VariableManager get_vars() 27885 1726882563.94008: Calling all_inventory to load vars for managed_node2 27885 1726882563.94010: Calling groups_inventory to load vars for managed_node2 27885 1726882563.94012: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882563.94022: Calling all_plugins_play to load vars for managed_node2 27885 1726882563.94024: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882563.94027: Calling groups_plugins_play to load vars for managed_node2 27885 1726882563.94925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882563.96141: done with get_vars() 27885 1726882563.96156: done getting variables 27885 1726882563.96226: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:36:03 -0400 (0:00:00.037) 0:00:36.604 ****** 27885 1726882563.96247: entering _queue_task() for managed_node2/shell 27885 1726882563.96248: Creating lock for shell 27885 1726882563.96467: worker is 1 (out of 1 available) 27885 1726882563.96481: exiting _queue_task() for managed_node2/shell 27885 1726882563.96501: done queuing things up, now waiting for results queue to drain 27885 1726882563.96503: waiting for pending results... 
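The set_fact task traced above (get_profile_stat.yml:17) is skipped on this run because profile_stat.stat.exists is false, i.e. no ifcfg-ethtest0 file exists under /etc/sysconfig/network-scripts. A minimal sketch, assuming it flips the lsr_net_profile_exists flag that was initialized to false earlier in the log (the exact fact assignment is not shown here):

# Sketch only: the fact set at get_profile_stat.yml:17 is assumed to be
# lsr_net_profile_exists, based on the earlier initialization task; the real
# task may set additional facts.
- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true
  when: profile_stat.stat.exists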
27885 1726882563.96666: running TaskExecutor() for managed_node2/TASK: Get NM profile info 27885 1726882563.96745: in run() - task 12673a56-9f93-3fa5-01be-000000000b3f 27885 1726882563.96757: variable 'ansible_search_path' from source: unknown 27885 1726882563.96761: variable 'ansible_search_path' from source: unknown 27885 1726882563.96790: calling self._execute() 27885 1726882563.96879: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882563.96885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882563.96898: variable 'omit' from source: magic vars 27885 1726882563.97168: variable 'ansible_distribution_major_version' from source: facts 27885 1726882563.97177: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882563.97182: variable 'omit' from source: magic vars 27885 1726882563.97220: variable 'omit' from source: magic vars 27885 1726882563.97292: variable 'profile' from source: include params 27885 1726882563.97299: variable 'item' from source: include params 27885 1726882563.97351: variable 'item' from source: include params 27885 1726882563.97365: variable 'omit' from source: magic vars 27885 1726882563.97403: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882563.97429: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882563.97444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882563.97457: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882563.97467: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882563.97496: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882563.97499: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882563.97502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882563.97567: Set connection var ansible_pipelining to False 27885 1726882563.97571: Set connection var ansible_connection to ssh 27885 1726882563.97576: Set connection var ansible_timeout to 10 27885 1726882563.97579: Set connection var ansible_shell_type to sh 27885 1726882563.97583: Set connection var ansible_shell_executable to /bin/sh 27885 1726882563.97591: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882563.97614: variable 'ansible_shell_executable' from source: unknown 27885 1726882563.97617: variable 'ansible_connection' from source: unknown 27885 1726882563.97620: variable 'ansible_module_compression' from source: unknown 27885 1726882563.97622: variable 'ansible_shell_type' from source: unknown 27885 1726882563.97624: variable 'ansible_shell_executable' from source: unknown 27885 1726882563.97626: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882563.97628: variable 'ansible_pipelining' from source: unknown 27885 1726882563.97634: variable 'ansible_timeout' from source: unknown 27885 1726882563.97636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882563.97735: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882563.97743: variable 'omit' from source: magic vars 27885 1726882563.97749: starting attempt loop 27885 1726882563.97752: running the handler 27885 1726882563.97761: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882563.97777: _low_level_execute_command(): starting 27885 1726882563.97784: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882563.98262: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882563.98302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882563.98306: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882563.98309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882563.98311: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882563.98354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882563.98357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882563.98360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882563.98429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882564.00012: stdout chunk (state=3): >>>/root <<< 27885 1726882564.00114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882564.00150: stderr chunk (state=3): >>><<< 27885 1726882564.00153: stdout chunk (state=3): >>><<< 27885 1726882564.00200: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882564.00203: _low_level_execute_command(): starting 27885 1726882564.00207: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882564.0016928-29536-165259049314659 `" && echo ansible-tmp-1726882564.0016928-29536-165259049314659="` echo /root/.ansible/tmp/ansible-tmp-1726882564.0016928-29536-165259049314659 `" ) && sleep 0' 27885 1726882564.00647: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882564.00652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882564.00656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882564.00660: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882564.00662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882564.00711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882564.00714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882564.00783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882564.02639: stdout chunk (state=3): >>>ansible-tmp-1726882564.0016928-29536-165259049314659=/root/.ansible/tmp/ansible-tmp-1726882564.0016928-29536-165259049314659 <<< 27885 1726882564.02748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882564.02777: stderr chunk (state=3): >>><<< 27885 1726882564.02780: stdout chunk (state=3): >>><<< 27885 1726882564.02799: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882564.0016928-29536-165259049314659=/root/.ansible/tmp/ansible-tmp-1726882564.0016928-29536-165259049314659 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882564.02830: variable 'ansible_module_compression' from source: unknown 27885 1726882564.02873: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882564.02906: variable 'ansible_facts' from source: unknown 27885 1726882564.02964: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882564.0016928-29536-165259049314659/AnsiballZ_command.py 27885 1726882564.03070: Sending initial data 27885 1726882564.03073: Sent initial data (156 bytes) 27885 1726882564.03522: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882564.03526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882564.03528: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882564.03530: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882564.03532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882564.03584: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882564.03591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882564.03596: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882564.03647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882564.05168: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 27885 1726882564.05175: stderr chunk (state=3): >>>debug2: 
Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882564.05226: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882564.05285: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpe87bxyp_ /root/.ansible/tmp/ansible-tmp-1726882564.0016928-29536-165259049314659/AnsiballZ_command.py <<< 27885 1726882564.05290: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882564.0016928-29536-165259049314659/AnsiballZ_command.py" <<< 27885 1726882564.05346: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpe87bxyp_" to remote "/root/.ansible/tmp/ansible-tmp-1726882564.0016928-29536-165259049314659/AnsiballZ_command.py" <<< 27885 1726882564.05349: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882564.0016928-29536-165259049314659/AnsiballZ_command.py" <<< 27885 1726882564.05947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882564.05997: stderr chunk (state=3): >>><<< 27885 1726882564.06000: stdout chunk (state=3): >>><<< 27885 1726882564.06019: done transferring module to remote 27885 1726882564.06028: _low_level_execute_command(): starting 27885 1726882564.06034: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882564.0016928-29536-165259049314659/ /root/.ansible/tmp/ansible-tmp-1726882564.0016928-29536-165259049314659/AnsiballZ_command.py && sleep 0' 27885 1726882564.06472: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882564.06480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882564.06482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882564.06484: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882564.06489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882564.06540: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882564.06543: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882564.06548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882564.06606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882564.08341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882564.08363: stderr chunk (state=3): >>><<< 27885 1726882564.08366: stdout chunk (state=3): >>><<< 27885 1726882564.08380: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882564.08383: _low_level_execute_command(): starting 27885 1726882564.08390: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882564.0016928-29536-165259049314659/AnsiballZ_command.py && sleep 0' 27885 1726882564.08843: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882564.08846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882564.08849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882564.08851: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882564.08853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882564.08909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882564.08914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882564.08978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882564.25438: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 21:36:04.236789", "end": "2024-09-20 21:36:04.253523", "delta": "0:00:00.016734", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27885 1726882564.26828: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.14.69 closed. <<< 27885 1726882564.26850: stderr chunk (state=3): >>><<< 27885 1726882564.26853: stdout chunk (state=3): >>><<< 27885 1726882564.26870: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 21:36:04.236789", "end": "2024-09-20 21:36:04.253523", "delta": "0:00:00.016734", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.14.69 closed. 
27885 1726882564.26908: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882564.0016928-29536-165259049314659/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882564.26916: _low_level_execute_command(): starting 27885 1726882564.26919: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882564.0016928-29536-165259049314659/ > /dev/null 2>&1 && sleep 0' 27885 1726882564.27490: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882564.27512: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882564.27545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882564.27637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882564.29414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882564.29433: stderr chunk (state=3): >>><<< 27885 1726882564.29436: stdout chunk (state=3): >>><<< 27885 1726882564.29447: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 
10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882564.29454: handler run complete 27885 1726882564.29474: Evaluated conditional (False): False 27885 1726882564.29483: attempt loop complete, returning result 27885 1726882564.29485: _execute() done 27885 1726882564.29491: dumping result to json 27885 1726882564.29498: done dumping result, returning 27885 1726882564.29505: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [12673a56-9f93-3fa5-01be-000000000b3f] 27885 1726882564.29510: sending task result for task 12673a56-9f93-3fa5-01be-000000000b3f 27885 1726882564.29602: done sending task result for task 12673a56-9f93-3fa5-01be-000000000b3f 27885 1726882564.29605: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.016734", "end": "2024-09-20 21:36:04.253523", "rc": 1, "start": "2024-09-20 21:36:04.236789" } MSG: non-zero return code ...ignoring 27885 1726882564.29673: no more pending results, returning what we have 27885 1726882564.29676: results queue empty 27885 1726882564.29677: checking for any_errors_fatal 27885 1726882564.29685: done checking for any_errors_fatal 27885 1726882564.29685: checking for max_fail_percentage 27885 1726882564.29687: done checking for max_fail_percentage 27885 1726882564.29688: checking to see if all hosts have failed and the running result is not ok 27885 1726882564.29689: done checking to see if all hosts have failed 27885 1726882564.29689: getting the remaining hosts for this loop 27885 1726882564.29691: done getting the remaining hosts for this loop 27885 1726882564.29700: getting the next task for host managed_node2 27885 1726882564.29707: done getting next task for host managed_node2 27885 1726882564.29710: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 27885 1726882564.29717: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882564.29721: getting variables 27885 1726882564.29723: in VariableManager get_vars() 27885 1726882564.29764: Calling all_inventory to load vars for managed_node2 27885 1726882564.29766: Calling groups_inventory to load vars for managed_node2 27885 1726882564.29769: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882564.29778: Calling all_plugins_play to load vars for managed_node2 27885 1726882564.29780: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882564.29782: Calling groups_plugins_play to load vars for managed_node2 27885 1726882564.31084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882564.32018: done with get_vars() 27885 1726882564.32035: done getting variables 27885 1726882564.32075: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:36:04 -0400 (0:00:00.358) 0:00:36.963 ****** 27885 1726882564.32102: entering _queue_task() for managed_node2/set_fact 27885 1726882564.32340: worker is 1 (out of 1 available) 27885 1726882564.32355: exiting _queue_task() for managed_node2/set_fact 27885 1726882564.32369: done queuing things up, now waiting for results queue to drain 27885 1726882564.32370: waiting for pending results... 
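The chunks above complete the round trip for the "Get NM profile info" task: the AnsiballZ_command.py payload is copied over SFTP, made executable with chmod u+x, run with the remote /usr/bin/python3.12, and its temporary directory is removed afterwards. The rc=1 comes from the grep pipeline finding no match for ethtest0 under /etc, not from nmcli itself, and the "...ignoring" marker shows the failure is tolerated. The task queued here at get_profile_stat.yml:35 is then skipped because the registered rc is not 0. A minimal sketch of how these two tasks could be written, assuming the shell module, a {{ profile }} variable, and the registered nm_profile_exists result visible in the log; the exact facts set by the second task are inferred from the lsr_net_profile_* flags that appear later in this log, not quoted from the playbook:

- name: Get NM profile info
  ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
  register: nm_profile_exists
  ignore_errors: true                        # grep exits 1 when no matching line is found, as in this run

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true    # assumed; only the rc == 0 condition is confirmed by this log
  when: nm_profile_exists.rc == 0            # rc is 1 here, so the log reports this task as skipped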
27885 1726882564.32545: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 27885 1726882564.32626: in run() - task 12673a56-9f93-3fa5-01be-000000000b40 27885 1726882564.32637: variable 'ansible_search_path' from source: unknown 27885 1726882564.32641: variable 'ansible_search_path' from source: unknown 27885 1726882564.32670: calling self._execute() 27885 1726882564.32751: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882564.32755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882564.32763: variable 'omit' from source: magic vars 27885 1726882564.33052: variable 'ansible_distribution_major_version' from source: facts 27885 1726882564.33062: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882564.33157: variable 'nm_profile_exists' from source: set_fact 27885 1726882564.33168: Evaluated conditional (nm_profile_exists.rc == 0): False 27885 1726882564.33171: when evaluation is False, skipping this task 27885 1726882564.33174: _execute() done 27885 1726882564.33176: dumping result to json 27885 1726882564.33179: done dumping result, returning 27885 1726882564.33189: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-3fa5-01be-000000000b40] 27885 1726882564.33192: sending task result for task 12673a56-9f93-3fa5-01be-000000000b40 27885 1726882564.33277: done sending task result for task 12673a56-9f93-3fa5-01be-000000000b40 27885 1726882564.33280: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 27885 1726882564.33327: no more pending results, returning what we have 27885 1726882564.33331: results queue empty 27885 1726882564.33332: checking for any_errors_fatal 27885 1726882564.33343: done checking for any_errors_fatal 27885 1726882564.33344: checking for max_fail_percentage 27885 1726882564.33345: done checking for max_fail_percentage 27885 1726882564.33346: checking to see if all hosts have failed and the running result is not ok 27885 1726882564.33347: done checking to see if all hosts have failed 27885 1726882564.33348: getting the remaining hosts for this loop 27885 1726882564.33349: done getting the remaining hosts for this loop 27885 1726882564.33353: getting the next task for host managed_node2 27885 1726882564.33362: done getting next task for host managed_node2 27885 1726882564.33365: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 27885 1726882564.33370: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882564.33374: getting variables 27885 1726882564.33376: in VariableManager get_vars() 27885 1726882564.33417: Calling all_inventory to load vars for managed_node2 27885 1726882564.33420: Calling groups_inventory to load vars for managed_node2 27885 1726882564.33422: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882564.33432: Calling all_plugins_play to load vars for managed_node2 27885 1726882564.33434: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882564.33436: Calling groups_plugins_play to load vars for managed_node2 27885 1726882564.34233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882564.35113: done with get_vars() 27885 1726882564.35132: done getting variables 27885 1726882564.35173: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882564.35261: variable 'profile' from source: include params 27885 1726882564.35265: variable 'item' from source: include params 27885 1726882564.35314: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:36:04 -0400 (0:00:00.032) 0:00:36.995 ****** 27885 1726882564.35337: entering _queue_task() for managed_node2/command 27885 1726882564.35590: worker is 1 (out of 1 available) 27885 1726882564.35605: exiting _queue_task() for managed_node2/command 27885 1726882564.35620: done queuing things up, now waiting for results queue to drain 27885 1726882564.35622: waiting for pending results... 
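The "Get the ansible_managed comment in ifcfg-ethtest0" task at get_profile_stat.yml:49 is guarded by profile_stat.stat.exists and is skipped in the next chunk because no ifcfg file exists for the profile. A sketch of such a guarded command task; the ifcfg path, the grep pattern, and the register name are illustrative assumptions, since the log only shows the false_condition:

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  ansible.builtin.command: grep "^# Ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ansible_managed_grep             # hypothetical register name
  when: profile_stat.stat.exists             # false in this run, so the task is skipped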
27885 1726882564.35788: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-ethtest0 27885 1726882564.35860: in run() - task 12673a56-9f93-3fa5-01be-000000000b42 27885 1726882564.35872: variable 'ansible_search_path' from source: unknown 27885 1726882564.35876: variable 'ansible_search_path' from source: unknown 27885 1726882564.35906: calling self._execute() 27885 1726882564.35983: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882564.35991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882564.36000: variable 'omit' from source: magic vars 27885 1726882564.36263: variable 'ansible_distribution_major_version' from source: facts 27885 1726882564.36272: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882564.36356: variable 'profile_stat' from source: set_fact 27885 1726882564.36366: Evaluated conditional (profile_stat.stat.exists): False 27885 1726882564.36369: when evaluation is False, skipping this task 27885 1726882564.36372: _execute() done 27885 1726882564.36376: dumping result to json 27885 1726882564.36379: done dumping result, returning 27885 1726882564.36389: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [12673a56-9f93-3fa5-01be-000000000b42] 27885 1726882564.36392: sending task result for task 12673a56-9f93-3fa5-01be-000000000b42 27885 1726882564.36471: done sending task result for task 12673a56-9f93-3fa5-01be-000000000b42 27885 1726882564.36474: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27885 1726882564.36548: no more pending results, returning what we have 27885 1726882564.36552: results queue empty 27885 1726882564.36553: checking for any_errors_fatal 27885 1726882564.36558: done checking for any_errors_fatal 27885 1726882564.36558: checking for max_fail_percentage 27885 1726882564.36560: done checking for max_fail_percentage 27885 1726882564.36561: checking to see if all hosts have failed and the running result is not ok 27885 1726882564.36561: done checking to see if all hosts have failed 27885 1726882564.36562: getting the remaining hosts for this loop 27885 1726882564.36564: done getting the remaining hosts for this loop 27885 1726882564.36567: getting the next task for host managed_node2 27885 1726882564.36573: done getting next task for host managed_node2 27885 1726882564.36575: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 27885 1726882564.36581: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882564.36584: getting variables 27885 1726882564.36585: in VariableManager get_vars() 27885 1726882564.36622: Calling all_inventory to load vars for managed_node2 27885 1726882564.36625: Calling groups_inventory to load vars for managed_node2 27885 1726882564.36627: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882564.36635: Calling all_plugins_play to load vars for managed_node2 27885 1726882564.36638: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882564.36640: Calling groups_plugins_play to load vars for managed_node2 27885 1726882564.37526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882564.38402: done with get_vars() 27885 1726882564.38421: done getting variables 27885 1726882564.38464: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882564.38546: variable 'profile' from source: include params 27885 1726882564.38549: variable 'item' from source: include params 27885 1726882564.38590: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:36:04 -0400 (0:00:00.032) 0:00:37.028 ****** 27885 1726882564.38615: entering _queue_task() for managed_node2/set_fact 27885 1726882564.38868: worker is 1 (out of 1 available) 27885 1726882564.38882: exiting _queue_task() for managed_node2/set_fact 27885 1726882564.38902: done queuing things up, now waiting for results queue to drain 27885 1726882564.38904: waiting for pending results... 
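Its "Verify ..." counterpart at get_profile_stat.yml:56 would raise the lsr_net_profile_ansible_managed flag only when the file exists (and, presumably, when the grep above succeeded); here it is skipped on the same profile_stat.stat.exists condition. A minimal sketch under those assumptions:

- name: Verify the ansible_managed comment in ifcfg-{{ profile }}
  ansible.builtin.set_fact:
    lsr_net_profile_ansible_managed: true
  when: profile_stat.stat.exists             # the false_condition reported below; further checks on the grep result are likely but not shown in this log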
27885 1726882564.39075: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 27885 1726882564.39160: in run() - task 12673a56-9f93-3fa5-01be-000000000b43 27885 1726882564.39172: variable 'ansible_search_path' from source: unknown 27885 1726882564.39175: variable 'ansible_search_path' from source: unknown 27885 1726882564.39209: calling self._execute() 27885 1726882564.39285: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882564.39291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882564.39301: variable 'omit' from source: magic vars 27885 1726882564.39571: variable 'ansible_distribution_major_version' from source: facts 27885 1726882564.39580: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882564.39663: variable 'profile_stat' from source: set_fact 27885 1726882564.39674: Evaluated conditional (profile_stat.stat.exists): False 27885 1726882564.39677: when evaluation is False, skipping this task 27885 1726882564.39680: _execute() done 27885 1726882564.39683: dumping result to json 27885 1726882564.39688: done dumping result, returning 27885 1726882564.39692: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [12673a56-9f93-3fa5-01be-000000000b43] 27885 1726882564.39702: sending task result for task 12673a56-9f93-3fa5-01be-000000000b43 27885 1726882564.39785: done sending task result for task 12673a56-9f93-3fa5-01be-000000000b43 27885 1726882564.39791: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27885 1726882564.39839: no more pending results, returning what we have 27885 1726882564.39843: results queue empty 27885 1726882564.39845: checking for any_errors_fatal 27885 1726882564.39853: done checking for any_errors_fatal 27885 1726882564.39854: checking for max_fail_percentage 27885 1726882564.39855: done checking for max_fail_percentage 27885 1726882564.39856: checking to see if all hosts have failed and the running result is not ok 27885 1726882564.39857: done checking to see if all hosts have failed 27885 1726882564.39858: getting the remaining hosts for this loop 27885 1726882564.39859: done getting the remaining hosts for this loop 27885 1726882564.39863: getting the next task for host managed_node2 27885 1726882564.39871: done getting next task for host managed_node2 27885 1726882564.39873: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 27885 1726882564.39880: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882564.39884: getting variables 27885 1726882564.39888: in VariableManager get_vars() 27885 1726882564.39934: Calling all_inventory to load vars for managed_node2 27885 1726882564.39937: Calling groups_inventory to load vars for managed_node2 27885 1726882564.39939: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882564.39949: Calling all_plugins_play to load vars for managed_node2 27885 1726882564.39951: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882564.39953: Calling groups_plugins_play to load vars for managed_node2 27885 1726882564.40758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882564.41746: done with get_vars() 27885 1726882564.41762: done getting variables 27885 1726882564.41808: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882564.41885: variable 'profile' from source: include params 27885 1726882564.41890: variable 'item' from source: include params 27885 1726882564.41931: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:36:04 -0400 (0:00:00.033) 0:00:37.062 ****** 27885 1726882564.41953: entering _queue_task() for managed_node2/command 27885 1726882564.42205: worker is 1 (out of 1 available) 27885 1726882564.42219: exiting _queue_task() for managed_node2/command 27885 1726882564.42234: done queuing things up, now waiting for results queue to drain 27885 1726882564.42236: waiting for pending results... 
27885 1726882564.42407: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-ethtest0 27885 1726882564.42482: in run() - task 12673a56-9f93-3fa5-01be-000000000b44 27885 1726882564.42496: variable 'ansible_search_path' from source: unknown 27885 1726882564.42500: variable 'ansible_search_path' from source: unknown 27885 1726882564.42529: calling self._execute() 27885 1726882564.42610: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882564.42615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882564.42624: variable 'omit' from source: magic vars 27885 1726882564.42877: variable 'ansible_distribution_major_version' from source: facts 27885 1726882564.42890: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882564.42971: variable 'profile_stat' from source: set_fact 27885 1726882564.42981: Evaluated conditional (profile_stat.stat.exists): False 27885 1726882564.42984: when evaluation is False, skipping this task 27885 1726882564.42989: _execute() done 27885 1726882564.42992: dumping result to json 27885 1726882564.42996: done dumping result, returning 27885 1726882564.43006: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-ethtest0 [12673a56-9f93-3fa5-01be-000000000b44] 27885 1726882564.43011: sending task result for task 12673a56-9f93-3fa5-01be-000000000b44 27885 1726882564.43089: done sending task result for task 12673a56-9f93-3fa5-01be-000000000b44 27885 1726882564.43092: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27885 1726882564.43168: no more pending results, returning what we have 27885 1726882564.43171: results queue empty 27885 1726882564.43172: checking for any_errors_fatal 27885 1726882564.43179: done checking for any_errors_fatal 27885 1726882564.43180: checking for max_fail_percentage 27885 1726882564.43181: done checking for max_fail_percentage 27885 1726882564.43182: checking to see if all hosts have failed and the running result is not ok 27885 1726882564.43182: done checking to see if all hosts have failed 27885 1726882564.43183: getting the remaining hosts for this loop 27885 1726882564.43185: done getting the remaining hosts for this loop 27885 1726882564.43190: getting the next task for host managed_node2 27885 1726882564.43199: done getting next task for host managed_node2 27885 1726882564.43202: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 27885 1726882564.43207: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882564.43210: getting variables 27885 1726882564.43211: in VariableManager get_vars() 27885 1726882564.43247: Calling all_inventory to load vars for managed_node2 27885 1726882564.43249: Calling groups_inventory to load vars for managed_node2 27885 1726882564.43251: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882564.43260: Calling all_plugins_play to load vars for managed_node2 27885 1726882564.43263: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882564.43265: Calling groups_plugins_play to load vars for managed_node2 27885 1726882564.44040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882564.44919: done with get_vars() 27885 1726882564.44936: done getting variables 27885 1726882564.44977: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882564.45055: variable 'profile' from source: include params 27885 1726882564.45058: variable 'item' from source: include params 27885 1726882564.45099: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:36:04 -0400 (0:00:00.031) 0:00:37.093 ****** 27885 1726882564.45122: entering _queue_task() for managed_node2/set_fact 27885 1726882564.45352: worker is 1 (out of 1 available) 27885 1726882564.45366: exiting _queue_task() for managed_node2/set_fact 27885 1726882564.45380: done queuing things up, now waiting for results queue to drain 27885 1726882564.45381: waiting for pending results... 
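The fingerprint pair at get_profile_stat.yml:62 and :69 mirrors the ansible_managed pair, targeting the lsr_net_profile_fingerprint flag instead, and both tasks are skipped here for the same reason. A compact sketch of the pattern, with the fingerprint comment left as a placeholder because the string is not visible in this log and the register name hypothetical:

- name: Get the fingerprint comment in ifcfg-{{ profile }}
  ansible.builtin.command: grep "<fingerprint comment>" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: fingerprint_grep                 # hypothetical register name
  when: profile_stat.stat.exists

- name: Verify the fingerprint comment in ifcfg-{{ profile }}
  ansible.builtin.set_fact:
    lsr_net_profile_fingerprint: true
  when: profile_stat.stat.exists             # false here, so both tasks are skipped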
27885 1726882564.45549: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-ethtest0 27885 1726882564.45629: in run() - task 12673a56-9f93-3fa5-01be-000000000b45 27885 1726882564.45642: variable 'ansible_search_path' from source: unknown 27885 1726882564.45645: variable 'ansible_search_path' from source: unknown 27885 1726882564.45671: calling self._execute() 27885 1726882564.45748: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882564.45753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882564.45761: variable 'omit' from source: magic vars 27885 1726882564.46011: variable 'ansible_distribution_major_version' from source: facts 27885 1726882564.46024: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882564.46109: variable 'profile_stat' from source: set_fact 27885 1726882564.46119: Evaluated conditional (profile_stat.stat.exists): False 27885 1726882564.46122: when evaluation is False, skipping this task 27885 1726882564.46125: _execute() done 27885 1726882564.46127: dumping result to json 27885 1726882564.46131: done dumping result, returning 27885 1726882564.46138: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [12673a56-9f93-3fa5-01be-000000000b45] 27885 1726882564.46143: sending task result for task 12673a56-9f93-3fa5-01be-000000000b45 27885 1726882564.46229: done sending task result for task 12673a56-9f93-3fa5-01be-000000000b45 27885 1726882564.46232: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27885 1726882564.46301: no more pending results, returning what we have 27885 1726882564.46304: results queue empty 27885 1726882564.46305: checking for any_errors_fatal 27885 1726882564.46309: done checking for any_errors_fatal 27885 1726882564.46310: checking for max_fail_percentage 27885 1726882564.46311: done checking for max_fail_percentage 27885 1726882564.46312: checking to see if all hosts have failed and the running result is not ok 27885 1726882564.46313: done checking to see if all hosts have failed 27885 1726882564.46313: getting the remaining hosts for this loop 27885 1726882564.46314: done getting the remaining hosts for this loop 27885 1726882564.46317: getting the next task for host managed_node2 27885 1726882564.46324: done getting next task for host managed_node2 27885 1726882564.46326: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 27885 1726882564.46329: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882564.46333: getting variables 27885 1726882564.46334: in VariableManager get_vars() 27885 1726882564.46366: Calling all_inventory to load vars for managed_node2 27885 1726882564.46368: Calling groups_inventory to load vars for managed_node2 27885 1726882564.46369: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882564.46378: Calling all_plugins_play to load vars for managed_node2 27885 1726882564.46380: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882564.46383: Calling groups_plugins_play to load vars for managed_node2 27885 1726882564.47285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882564.48139: done with get_vars() 27885 1726882564.48155: done getting variables 27885 1726882564.48219: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882564.48318: variable 'profile' from source: include params 27885 1726882564.48321: variable 'item' from source: include params 27885 1726882564.48374: variable 'item' from source: include params TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:36:04 -0400 (0:00:00.032) 0:00:37.126 ****** 27885 1726882564.48407: entering _queue_task() for managed_node2/assert 27885 1726882564.48694: worker is 1 (out of 1 available) 27885 1726882564.48706: exiting _queue_task() for managed_node2/assert 27885 1726882564.48719: done queuing things up, now waiting for results queue to drain 27885 1726882564.48720: waiting for pending results... 
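The assertion queued at assert_profile_absent.yml:5 passes in the next chunk, where the log shows "Evaluated conditional (not lsr_net_profile_exists): True". A sketch consistent with that condition; the failure message is an assumption, not taken from the playbook:

- name: Assert that the profile is absent - '{{ profile }}'
  ansible.builtin.assert:
    that:
      - not lsr_net_profile_exists
    fail_msg: "profile {{ profile }} is still present"   # hypothetical message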
27885 1726882564.49114: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'ethtest0' 27885 1726882564.49121: in run() - task 12673a56-9f93-3fa5-01be-000000000a6d 27885 1726882564.49133: variable 'ansible_search_path' from source: unknown 27885 1726882564.49142: variable 'ansible_search_path' from source: unknown 27885 1726882564.49181: calling self._execute() 27885 1726882564.49289: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882564.49320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882564.49336: variable 'omit' from source: magic vars 27885 1726882564.49607: variable 'ansible_distribution_major_version' from source: facts 27885 1726882564.49618: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882564.49624: variable 'omit' from source: magic vars 27885 1726882564.49658: variable 'omit' from source: magic vars 27885 1726882564.49729: variable 'profile' from source: include params 27885 1726882564.49733: variable 'item' from source: include params 27885 1726882564.49780: variable 'item' from source: include params 27885 1726882564.49800: variable 'omit' from source: magic vars 27885 1726882564.49831: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882564.49857: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882564.49873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882564.49887: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882564.49903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882564.49927: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882564.49930: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882564.49933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882564.50007: Set connection var ansible_pipelining to False 27885 1726882564.50010: Set connection var ansible_connection to ssh 27885 1726882564.50017: Set connection var ansible_timeout to 10 27885 1726882564.50019: Set connection var ansible_shell_type to sh 27885 1726882564.50024: Set connection var ansible_shell_executable to /bin/sh 27885 1726882564.50029: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882564.50058: variable 'ansible_shell_executable' from source: unknown 27885 1726882564.50061: variable 'ansible_connection' from source: unknown 27885 1726882564.50064: variable 'ansible_module_compression' from source: unknown 27885 1726882564.50066: variable 'ansible_shell_type' from source: unknown 27885 1726882564.50069: variable 'ansible_shell_executable' from source: unknown 27885 1726882564.50071: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882564.50076: variable 'ansible_pipelining' from source: unknown 27885 1726882564.50078: variable 'ansible_timeout' from source: unknown 27885 1726882564.50080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882564.50181: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882564.50194: variable 'omit' from source: magic vars 27885 1726882564.50197: starting attempt loop 27885 1726882564.50200: running the handler 27885 1726882564.50282: variable 'lsr_net_profile_exists' from source: set_fact 27885 1726882564.50286: Evaluated conditional (not lsr_net_profile_exists): True 27885 1726882564.50296: handler run complete 27885 1726882564.50307: attempt loop complete, returning result 27885 1726882564.50311: _execute() done 27885 1726882564.50314: dumping result to json 27885 1726882564.50317: done dumping result, returning 27885 1726882564.50322: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'ethtest0' [12673a56-9f93-3fa5-01be-000000000a6d] 27885 1726882564.50328: sending task result for task 12673a56-9f93-3fa5-01be-000000000a6d 27885 1726882564.50404: done sending task result for task 12673a56-9f93-3fa5-01be-000000000a6d 27885 1726882564.50407: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 27885 1726882564.50471: no more pending results, returning what we have 27885 1726882564.50474: results queue empty 27885 1726882564.50475: checking for any_errors_fatal 27885 1726882564.50482: done checking for any_errors_fatal 27885 1726882564.50482: checking for max_fail_percentage 27885 1726882564.50484: done checking for max_fail_percentage 27885 1726882564.50485: checking to see if all hosts have failed and the running result is not ok 27885 1726882564.50486: done checking to see if all hosts have failed 27885 1726882564.50486: getting the remaining hosts for this loop 27885 1726882564.50488: done getting the remaining hosts for this loop 27885 1726882564.50491: getting the next task for host managed_node2 27885 1726882564.50502: done getting next task for host managed_node2 27885 1726882564.50505: ^ task is: TASK: Include the task 'get_profile_stat.yml' 27885 1726882564.50509: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882564.50513: getting variables 27885 1726882564.50514: in VariableManager get_vars() 27885 1726882564.50547: Calling all_inventory to load vars for managed_node2 27885 1726882564.50549: Calling groups_inventory to load vars for managed_node2 27885 1726882564.50552: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882564.50560: Calling all_plugins_play to load vars for managed_node2 27885 1726882564.50562: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882564.50565: Calling groups_plugins_play to load vars for managed_node2 27885 1726882564.51527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882564.53057: done with get_vars() 27885 1726882564.53087: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:36:04 -0400 (0:00:00.047) 0:00:37.174 ****** 27885 1726882564.53184: entering _queue_task() for managed_node2/include_tasks 27885 1726882564.53539: worker is 1 (out of 1 available) 27885 1726882564.53551: exiting _queue_task() for managed_node2/include_tasks 27885 1726882564.53566: done queuing things up, now waiting for results queue to drain 27885 1726882564.53567: waiting for pending results... 27885 1726882564.53912: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 27885 1726882564.54005: in run() - task 12673a56-9f93-3fa5-01be-000000000a71 27885 1726882564.54101: variable 'ansible_search_path' from source: unknown 27885 1726882564.54105: variable 'ansible_search_path' from source: unknown 27885 1726882564.54108: calling self._execute() 27885 1726882564.54186: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882564.54201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882564.54217: variable 'omit' from source: magic vars 27885 1726882564.54681: variable 'ansible_distribution_major_version' from source: facts 27885 1726882564.54746: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882564.54758: _execute() done 27885 1726882564.54769: dumping result to json 27885 1726882564.54778: done dumping result, returning 27885 1726882564.54790: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-3fa5-01be-000000000a71] 27885 1726882564.54805: sending task result for task 12673a56-9f93-3fa5-01be-000000000a71 27885 1726882564.55019: done sending task result for task 12673a56-9f93-3fa5-01be-000000000a71 27885 1726882564.55023: WORKER PROCESS EXITING 27885 1726882564.55053: no more pending results, returning what we have 27885 1726882564.55059: in VariableManager get_vars() 27885 1726882564.55114: Calling all_inventory to load vars for managed_node2 27885 1726882564.55117: Calling groups_inventory to load vars for managed_node2 27885 1726882564.55119: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882564.55134: Calling all_plugins_play to load vars for managed_node2 27885 1726882564.55137: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882564.55140: Calling groups_plugins_play to load vars for managed_node2 27885 1726882564.57374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 27885 1726882564.59835: done with get_vars() 27885 1726882564.59867: variable 'ansible_search_path' from source: unknown 27885 1726882564.59869: variable 'ansible_search_path' from source: unknown 27885 1726882564.59909: we have included files to process 27885 1726882564.59911: generating all_blocks data 27885 1726882564.59912: done generating all_blocks data 27885 1726882564.59917: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 27885 1726882564.59918: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 27885 1726882564.59919: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 27885 1726882564.60913: done processing included file 27885 1726882564.60915: iterating over new_blocks loaded from include file 27885 1726882564.60917: in VariableManager get_vars() 27885 1726882564.61065: done with get_vars() 27885 1726882564.61067: filtering new block on tags 27885 1726882564.61138: done filtering new block on tags 27885 1726882564.61142: in VariableManager get_vars() 27885 1726882564.61284: done with get_vars() 27885 1726882564.61286: filtering new block on tags 27885 1726882564.61346: done filtering new block on tags 27885 1726882564.61349: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 27885 1726882564.61354: extending task lists for all hosts with included blocks 27885 1726882564.61711: done extending task lists 27885 1726882564.61713: done processing included files 27885 1726882564.61714: results queue empty 27885 1726882564.61714: checking for any_errors_fatal 27885 1726882564.61718: done checking for any_errors_fatal 27885 1726882564.61718: checking for max_fail_percentage 27885 1726882564.61719: done checking for max_fail_percentage 27885 1726882564.61720: checking to see if all hosts have failed and the running result is not ok 27885 1726882564.61721: done checking to see if all hosts have failed 27885 1726882564.61722: getting the remaining hosts for this loop 27885 1726882564.61723: done getting the remaining hosts for this loop 27885 1726882564.61726: getting the next task for host managed_node2 27885 1726882564.61731: done getting next task for host managed_node2 27885 1726882564.61733: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 27885 1726882564.61737: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882564.61739: getting variables 27885 1726882564.61740: in VariableManager get_vars() 27885 1726882564.61756: Calling all_inventory to load vars for managed_node2 27885 1726882564.61759: Calling groups_inventory to load vars for managed_node2 27885 1726882564.61761: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882564.61767: Calling all_plugins_play to load vars for managed_node2 27885 1726882564.61769: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882564.61772: Calling groups_plugins_play to load vars for managed_node2 27885 1726882564.65042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882564.68225: done with get_vars() 27885 1726882564.68248: done getting variables 27885 1726882564.68305: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:36:04 -0400 (0:00:00.151) 0:00:37.325 ****** 27885 1726882564.68337: entering _queue_task() for managed_node2/set_fact 27885 1726882564.68727: worker is 1 (out of 1 available) 27885 1726882564.68739: exiting _queue_task() for managed_node2/set_fact 27885 1726882564.68752: done queuing things up, now waiting for results queue to drain 27885 1726882564.68753: waiting for pending results... 
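The include queued earlier (assert_profile_absent.yml:3) pulls get_profile_stat.yml back in, and its first task at line 3 resets the three lsr_net_profile_* flags, exactly as the ansible_facts block in the next chunk reports. A sketch that reproduces those facts:

- name: Initialize NM profile exist and ansible_managed comment flag
  ansible.builtin.set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false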
27885 1726882564.69214: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 27885 1726882564.69221: in run() - task 12673a56-9f93-3fa5-01be-000000000b79 27885 1726882564.69229: variable 'ansible_search_path' from source: unknown 27885 1726882564.69232: variable 'ansible_search_path' from source: unknown 27885 1726882564.69244: calling self._execute() 27885 1726882564.69353: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882564.69359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882564.69369: variable 'omit' from source: magic vars 27885 1726882564.69758: variable 'ansible_distribution_major_version' from source: facts 27885 1726882564.69778: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882564.69784: variable 'omit' from source: magic vars 27885 1726882564.69847: variable 'omit' from source: magic vars 27885 1726882564.69887: variable 'omit' from source: magic vars 27885 1726882564.69935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882564.69970: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882564.69995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882564.70014: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882564.70032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882564.70062: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882564.70065: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882564.70067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882564.70373: Set connection var ansible_pipelining to False 27885 1726882564.70376: Set connection var ansible_connection to ssh 27885 1726882564.70379: Set connection var ansible_timeout to 10 27885 1726882564.70381: Set connection var ansible_shell_type to sh 27885 1726882564.70383: Set connection var ansible_shell_executable to /bin/sh 27885 1726882564.70386: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882564.70388: variable 'ansible_shell_executable' from source: unknown 27885 1726882564.70390: variable 'ansible_connection' from source: unknown 27885 1726882564.70395: variable 'ansible_module_compression' from source: unknown 27885 1726882564.70398: variable 'ansible_shell_type' from source: unknown 27885 1726882564.70401: variable 'ansible_shell_executable' from source: unknown 27885 1726882564.70402: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882564.70404: variable 'ansible_pipelining' from source: unknown 27885 1726882564.70406: variable 'ansible_timeout' from source: unknown 27885 1726882564.70408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882564.70674: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882564.70685: variable 
'omit' from source: magic vars 27885 1726882564.70699: starting attempt loop 27885 1726882564.70708: running the handler 27885 1726882564.70722: handler run complete 27885 1726882564.70733: attempt loop complete, returning result 27885 1726882564.70736: _execute() done 27885 1726882564.70738: dumping result to json 27885 1726882564.70741: done dumping result, returning 27885 1726882564.70749: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-3fa5-01be-000000000b79] 27885 1726882564.70753: sending task result for task 12673a56-9f93-3fa5-01be-000000000b79 27885 1726882564.70957: done sending task result for task 12673a56-9f93-3fa5-01be-000000000b79 27885 1726882564.70961: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 27885 1726882564.71102: no more pending results, returning what we have 27885 1726882564.71107: results queue empty 27885 1726882564.71108: checking for any_errors_fatal 27885 1726882564.71110: done checking for any_errors_fatal 27885 1726882564.71110: checking for max_fail_percentage 27885 1726882564.71112: done checking for max_fail_percentage 27885 1726882564.71113: checking to see if all hosts have failed and the running result is not ok 27885 1726882564.71114: done checking to see if all hosts have failed 27885 1726882564.71114: getting the remaining hosts for this loop 27885 1726882564.71116: done getting the remaining hosts for this loop 27885 1726882564.71120: getting the next task for host managed_node2 27885 1726882564.71130: done getting next task for host managed_node2 27885 1726882564.71137: ^ task is: TASK: Stat profile file 27885 1726882564.71144: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882564.71149: getting variables 27885 1726882564.71151: in VariableManager get_vars() 27885 1726882564.71196: Calling all_inventory to load vars for managed_node2 27885 1726882564.71200: Calling groups_inventory to load vars for managed_node2 27885 1726882564.71202: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882564.71214: Calling all_plugins_play to load vars for managed_node2 27885 1726882564.71217: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882564.71220: Calling groups_plugins_play to load vars for managed_node2 27885 1726882564.73238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882564.75192: done with get_vars() 27885 1726882564.75221: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:36:04 -0400 (0:00:00.069) 0:00:37.395 ****** 27885 1726882564.75328: entering _queue_task() for managed_node2/stat 27885 1726882564.75741: worker is 1 (out of 1 available) 27885 1726882564.75753: exiting _queue_task() for managed_node2/stat 27885 1726882564.75765: done queuing things up, now waiting for results queue to drain 27885 1726882564.75766: waiting for pending results... 27885 1726882564.76198: running TaskExecutor() for managed_node2/TASK: Stat profile file 27885 1726882564.76203: in run() - task 12673a56-9f93-3fa5-01be-000000000b7a 27885 1726882564.76207: variable 'ansible_search_path' from source: unknown 27885 1726882564.76210: variable 'ansible_search_path' from source: unknown 27885 1726882564.76213: calling self._execute() 27885 1726882564.76311: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882564.76318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882564.76334: variable 'omit' from source: magic vars 27885 1726882564.76763: variable 'ansible_distribution_major_version' from source: facts 27885 1726882564.76777: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882564.76783: variable 'omit' from source: magic vars 27885 1726882564.76849: variable 'omit' from source: magic vars 27885 1726882564.76956: variable 'profile' from source: include params 27885 1726882564.76960: variable 'item' from source: include params 27885 1726882564.77031: variable 'item' from source: include params 27885 1726882564.77059: variable 'omit' from source: magic vars 27885 1726882564.77106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882564.77141: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882564.77167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882564.77184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882564.77202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882564.77234: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882564.77238: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882564.77240: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882564.77598: Set connection var ansible_pipelining to False 27885 1726882564.77601: Set connection var ansible_connection to ssh 27885 1726882564.77603: Set connection var ansible_timeout to 10 27885 1726882564.77606: Set connection var ansible_shell_type to sh 27885 1726882564.77608: Set connection var ansible_shell_executable to /bin/sh 27885 1726882564.77610: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882564.77612: variable 'ansible_shell_executable' from source: unknown 27885 1726882564.77615: variable 'ansible_connection' from source: unknown 27885 1726882564.77617: variable 'ansible_module_compression' from source: unknown 27885 1726882564.77618: variable 'ansible_shell_type' from source: unknown 27885 1726882564.77620: variable 'ansible_shell_executable' from source: unknown 27885 1726882564.77622: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882564.77624: variable 'ansible_pipelining' from source: unknown 27885 1726882564.77626: variable 'ansible_timeout' from source: unknown 27885 1726882564.77628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882564.77641: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27885 1726882564.77652: variable 'omit' from source: magic vars 27885 1726882564.77658: starting attempt loop 27885 1726882564.77661: running the handler 27885 1726882564.77677: _low_level_execute_command(): starting 27885 1726882564.77684: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882564.78479: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882564.78574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882564.78619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882564.78695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882564.80372: stdout chunk (state=3): >>>/root <<< 27885 1726882564.80512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882564.80533: stderr chunk (state=3): >>><<< 27885 1726882564.80552: stdout chunk (state=3): >>><<< 27885 1726882564.80578: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882564.80605: _low_level_execute_command(): starting 27885 1726882564.80617: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882564.8058503-29574-7581061977858 `" && echo ansible-tmp-1726882564.8058503-29574-7581061977858="` echo /root/.ansible/tmp/ansible-tmp-1726882564.8058503-29574-7581061977858 `" ) && sleep 0' 27885 1726882564.81278: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882564.81309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882564.81349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882564.81458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882564.81479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882564.81581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882564.83533: stdout chunk (state=3): >>>ansible-tmp-1726882564.8058503-29574-7581061977858=/root/.ansible/tmp/ansible-tmp-1726882564.8058503-29574-7581061977858 <<< 27885 1726882564.83620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882564.83648: stderr chunk (state=3): >>><<< 27885 1726882564.83651: stdout chunk (state=3): >>><<< 27885 1726882564.83668: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882564.8058503-29574-7581061977858=/root/.ansible/tmp/ansible-tmp-1726882564.8058503-29574-7581061977858 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882564.83799: variable 'ansible_module_compression' from source: unknown 27885 1726882564.83803: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 27885 1726882564.83834: variable 'ansible_facts' from source: unknown 27885 1726882564.83938: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882564.8058503-29574-7581061977858/AnsiballZ_stat.py 27885 1726882564.84166: Sending initial data 27885 1726882564.84175: Sent initial data (151 bytes) 27885 1726882564.84713: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882564.84720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882564.84728: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882564.84735: stderr chunk (state=3): >>>debug2: match found <<< 27885 1726882564.84813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882564.84825: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882564.84851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882564.84925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882564.86456: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" 
revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882564.86524: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882564.86589: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpemqn2msy /root/.ansible/tmp/ansible-tmp-1726882564.8058503-29574-7581061977858/AnsiballZ_stat.py <<< 27885 1726882564.86592: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882564.8058503-29574-7581061977858/AnsiballZ_stat.py" <<< 27885 1726882564.86683: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmpemqn2msy" to remote "/root/.ansible/tmp/ansible-tmp-1726882564.8058503-29574-7581061977858/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882564.8058503-29574-7581061977858/AnsiballZ_stat.py" <<< 27885 1726882564.87455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882564.87511: stderr chunk (state=3): >>><<< 27885 1726882564.87563: stdout chunk (state=3): >>><<< 27885 1726882564.87567: done transferring module to remote 27885 1726882564.87583: _low_level_execute_command(): starting 27885 1726882564.87592: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882564.8058503-29574-7581061977858/ /root/.ansible/tmp/ansible-tmp-1726882564.8058503-29574-7581061977858/AnsiballZ_stat.py && sleep 0' 27885 1726882564.88180: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882564.88197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882564.88212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882564.88227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882564.88329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882564.88355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882564.88371: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882564.88461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882564.90168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882564.90221: stderr chunk (state=3): >>><<< 27885 1726882564.90238: stdout chunk (state=3): >>><<< 27885 1726882564.90259: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882564.90267: _low_level_execute_command(): starting 27885 1726882564.90277: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882564.8058503-29574-7581061977858/AnsiballZ_stat.py && sleep 0' 27885 1726882564.90875: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882564.90890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882564.90906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882564.90924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882564.90951: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882564.90964: stderr chunk (state=3): >>>debug2: match not found <<< 27885 1726882564.90979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882564.91066: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27885 1726882564.91069: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882564.91097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882564.91112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882564.91130: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882564.91223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882565.06197: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 27885 1726882565.07425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 27885 1726882565.07429: stdout chunk (state=3): >>><<< 27885 1726882565.07432: stderr chunk (state=3): >>><<< 27885 1726882565.07448: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
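The module invocation visible in the stdout chunk above corresponds to the "Stat profile file" task at get_profile_stat.yml:9. A hedged reconstruction from the module_args in the log; templating the path with a profile variable and registering the result as profile_stat are assumptions inferred from the include params and the later conditional on profile_stat.stat.exists:

    - name: Stat profile file
      stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"   # resolves to ifcfg-ethtest1 in this run
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat   # assumed name, based on the profile_stat.stat.exists check that follows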
27885 1726882565.07585: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882564.8058503-29574-7581061977858/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882565.07592: _low_level_execute_command(): starting 27885 1726882565.07597: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882564.8058503-29574-7581061977858/ > /dev/null 2>&1 && sleep 0' 27885 1726882565.08541: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882565.08779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882565.08798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882565.08822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882565.08909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882565.08962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882565.08987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882565.09015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882565.09140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882565.10939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882565.11008: stderr chunk (state=3): >>><<< 27885 1726882565.11011: stdout chunk (state=3): >>><<< 27885 1726882565.11030: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882565.11036: handler run complete 27885 1726882565.11059: attempt loop complete, returning result 27885 1726882565.11062: _execute() done 27885 1726882565.11064: dumping result to json 27885 1726882565.11067: done dumping result, returning 27885 1726882565.11076: done running TaskExecutor() for managed_node2/TASK: Stat profile file [12673a56-9f93-3fa5-01be-000000000b7a] 27885 1726882565.11081: sending task result for task 12673a56-9f93-3fa5-01be-000000000b7a 27885 1726882565.11177: done sending task result for task 12673a56-9f93-3fa5-01be-000000000b7a 27885 1726882565.11179: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 27885 1726882565.11471: no more pending results, returning what we have 27885 1726882565.11475: results queue empty 27885 1726882565.11476: checking for any_errors_fatal 27885 1726882565.11484: done checking for any_errors_fatal 27885 1726882565.11485: checking for max_fail_percentage 27885 1726882565.11490: done checking for max_fail_percentage 27885 1726882565.11491: checking to see if all hosts have failed and the running result is not ok 27885 1726882565.11492: done checking to see if all hosts have failed 27885 1726882565.11492: getting the remaining hosts for this loop 27885 1726882565.11496: done getting the remaining hosts for this loop 27885 1726882565.11500: getting the next task for host managed_node2 27885 1726882565.11507: done getting next task for host managed_node2 27885 1726882565.11509: ^ task is: TASK: Set NM profile exist flag based on the profile files 27885 1726882565.11514: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882565.11519: getting variables 27885 1726882565.11521: in VariableManager get_vars() 27885 1726882565.11560: Calling all_inventory to load vars for managed_node2 27885 1726882565.11563: Calling groups_inventory to load vars for managed_node2 27885 1726882565.11565: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882565.11575: Calling all_plugins_play to load vars for managed_node2 27885 1726882565.11578: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882565.11580: Calling groups_plugins_play to load vars for managed_node2 27885 1726882565.13579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882565.15396: done with get_vars() 27885 1726882565.15420: done getting variables 27885 1726882565.15478: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:36:05 -0400 (0:00:00.401) 0:00:37.797 ****** 27885 1726882565.15518: entering _queue_task() for managed_node2/set_fact 27885 1726882565.15871: worker is 1 (out of 1 available) 27885 1726882565.15882: exiting _queue_task() for managed_node2/set_fact 27885 1726882565.16045: done queuing things up, now waiting for results queue to drain 27885 1726882565.16047: waiting for pending results... 
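The set_fact task queued here (get_profile_stat.yml:17) is guarded by the stat result; because profile_stat.stat.exists is false, it is skipped in the next entry. A hedged sketch, with the fact name and value assumed since the skipped task never reveals them in this log:

    - name: Set NM profile exist flag based on the profile files
      set_fact:
        lsr_net_profile_exists: true   # assumed fact and value; not visible here because the task is skipped
      when: profile_stat.stat.exists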
27885 1726882565.16385: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 27885 1726882565.16392: in run() - task 12673a56-9f93-3fa5-01be-000000000b7b 27885 1726882565.16398: variable 'ansible_search_path' from source: unknown 27885 1726882565.16402: variable 'ansible_search_path' from source: unknown 27885 1726882565.16480: calling self._execute() 27885 1726882565.16749: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882565.16752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882565.16755: variable 'omit' from source: magic vars 27885 1726882565.17601: variable 'ansible_distribution_major_version' from source: facts 27885 1726882565.17605: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882565.17864: variable 'profile_stat' from source: set_fact 27885 1726882565.17912: Evaluated conditional (profile_stat.stat.exists): False 27885 1726882565.17921: when evaluation is False, skipping this task 27885 1726882565.17930: _execute() done 27885 1726882565.17936: dumping result to json 27885 1726882565.17943: done dumping result, returning 27885 1726882565.17960: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-3fa5-01be-000000000b7b] 27885 1726882565.18099: sending task result for task 12673a56-9f93-3fa5-01be-000000000b7b skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27885 1726882565.18373: no more pending results, returning what we have 27885 1726882565.18378: results queue empty 27885 1726882565.18379: checking for any_errors_fatal 27885 1726882565.18394: done checking for any_errors_fatal 27885 1726882565.18395: checking for max_fail_percentage 27885 1726882565.18398: done checking for max_fail_percentage 27885 1726882565.18399: checking to see if all hosts have failed and the running result is not ok 27885 1726882565.18400: done checking to see if all hosts have failed 27885 1726882565.18401: getting the remaining hosts for this loop 27885 1726882565.18403: done getting the remaining hosts for this loop 27885 1726882565.18406: getting the next task for host managed_node2 27885 1726882565.18416: done getting next task for host managed_node2 27885 1726882565.18419: ^ task is: TASK: Get NM profile info 27885 1726882565.18425: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 27885 1726882565.18430: getting variables 27885 1726882565.18432: in VariableManager get_vars() 27885 1726882565.18473: Calling all_inventory to load vars for managed_node2 27885 1726882565.18476: Calling groups_inventory to load vars for managed_node2 27885 1726882565.18478: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882565.18529: Calling all_plugins_play to load vars for managed_node2 27885 1726882565.18534: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882565.18543: done sending task result for task 12673a56-9f93-3fa5-01be-000000000b7b 27885 1726882565.18546: WORKER PROCESS EXITING 27885 1726882565.18551: Calling groups_plugins_play to load vars for managed_node2 27885 1726882565.20160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882565.22148: done with get_vars() 27885 1726882565.22179: done getting variables 27885 1726882565.22248: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:36:05 -0400 (0:00:00.067) 0:00:37.865 ****** 27885 1726882565.22301: entering _queue_task() for managed_node2/shell 27885 1726882565.22739: worker is 1 (out of 1 available) 27885 1726882565.22752: exiting _queue_task() for managed_node2/shell 27885 1726882565.22766: done queuing things up, now waiting for results queue to drain 27885 1726882565.22768: waiting for pending results... 
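The shell task queued here (get_profile_stat.yml:25) runs the nmcli pipeline that appears verbatim in the command module invocation further down. A sketch assuming the profile name is templated into the grep; the register name is hypothetical and does not appear in this portion of the log:

    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
      register: nm_profile_exists   # hypothetical name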
27885 1726882565.23057: running TaskExecutor() for managed_node2/TASK: Get NM profile info 27885 1726882565.23273: in run() - task 12673a56-9f93-3fa5-01be-000000000b7c 27885 1726882565.23281: variable 'ansible_search_path' from source: unknown 27885 1726882565.23285: variable 'ansible_search_path' from source: unknown 27885 1726882565.23291: calling self._execute() 27885 1726882565.23395: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882565.23414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882565.23431: variable 'omit' from source: magic vars 27885 1726882565.23837: variable 'ansible_distribution_major_version' from source: facts 27885 1726882565.23857: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882565.23898: variable 'omit' from source: magic vars 27885 1726882565.23937: variable 'omit' from source: magic vars 27885 1726882565.24050: variable 'profile' from source: include params 27885 1726882565.24065: variable 'item' from source: include params 27885 1726882565.24141: variable 'item' from source: include params 27885 1726882565.24251: variable 'omit' from source: magic vars 27885 1726882565.24254: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882565.24260: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882565.24292: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882565.24317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882565.24336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882565.24373: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882565.24385: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882565.24400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882565.24576: Set connection var ansible_pipelining to False 27885 1726882565.24579: Set connection var ansible_connection to ssh 27885 1726882565.24582: Set connection var ansible_timeout to 10 27885 1726882565.24585: Set connection var ansible_shell_type to sh 27885 1726882565.24590: Set connection var ansible_shell_executable to /bin/sh 27885 1726882565.24595: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882565.24597: variable 'ansible_shell_executable' from source: unknown 27885 1726882565.24599: variable 'ansible_connection' from source: unknown 27885 1726882565.24603: variable 'ansible_module_compression' from source: unknown 27885 1726882565.24606: variable 'ansible_shell_type' from source: unknown 27885 1726882565.24608: variable 'ansible_shell_executable' from source: unknown 27885 1726882565.24610: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882565.24685: variable 'ansible_pipelining' from source: unknown 27885 1726882565.24692: variable 'ansible_timeout' from source: unknown 27885 1726882565.24696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882565.24790: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882565.24833: variable 'omit' from source: magic vars 27885 1726882565.24836: starting attempt loop 27885 1726882565.24838: running the handler 27885 1726882565.24843: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882565.24868: _low_level_execute_command(): starting 27885 1726882565.24908: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882565.25631: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882565.25685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882565.25777: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882565.25799: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882565.25832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882565.25923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882565.27504: stdout chunk (state=3): >>>/root <<< 27885 1726882565.27663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882565.27666: stdout chunk (state=3): >>><<< 27885 1726882565.27669: stderr chunk (state=3): >>><<< 27885 1726882565.27776: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882565.27782: _low_level_execute_command(): starting 27885 1726882565.27786: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882565.2769487-29600-223180596682414 `" && echo ansible-tmp-1726882565.2769487-29600-223180596682414="` echo /root/.ansible/tmp/ansible-tmp-1726882565.2769487-29600-223180596682414 `" ) && sleep 0' 27885 1726882565.28343: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882565.28411: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882565.28461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882565.28484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882565.28508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882565.28601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882565.30475: stdout chunk (state=3): >>>ansible-tmp-1726882565.2769487-29600-223180596682414=/root/.ansible/tmp/ansible-tmp-1726882565.2769487-29600-223180596682414 <<< 27885 1726882565.30798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882565.30802: stdout chunk (state=3): >>><<< 27885 1726882565.30804: stderr chunk (state=3): >>><<< 27885 1726882565.30807: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882565.2769487-29600-223180596682414=/root/.ansible/tmp/ansible-tmp-1726882565.2769487-29600-223180596682414 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882565.30809: variable 'ansible_module_compression' from source: unknown 27885 1726882565.30811: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882565.30829: variable 'ansible_facts' from source: unknown 27885 1726882565.30923: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882565.2769487-29600-223180596682414/AnsiballZ_command.py 27885 1726882565.31081: Sending initial data 27885 1726882565.31091: Sent initial data (156 bytes) 27885 1726882565.31706: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882565.31722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882565.31815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882565.31854: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882565.31871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882565.31900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882565.31987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882565.33546: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882565.33627: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27885 1726882565.33690: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmptoaxapdy /root/.ansible/tmp/ansible-tmp-1726882565.2769487-29600-223180596682414/AnsiballZ_command.py <<< 27885 1726882565.33698: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882565.2769487-29600-223180596682414/AnsiballZ_command.py" <<< 27885 1726882565.33764: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmptoaxapdy" to remote "/root/.ansible/tmp/ansible-tmp-1726882565.2769487-29600-223180596682414/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882565.2769487-29600-223180596682414/AnsiballZ_command.py" <<< 27885 1726882565.34656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882565.34660: stdout chunk (state=3): >>><<< 27885 1726882565.34662: stderr chunk (state=3): >>><<< 27885 1726882565.34664: done transferring module to remote 27885 1726882565.34667: _low_level_execute_command(): starting 27885 1726882565.34669: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882565.2769487-29600-223180596682414/ /root/.ansible/tmp/ansible-tmp-1726882565.2769487-29600-223180596682414/AnsiballZ_command.py && sleep 0' 27885 1726882565.35382: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882565.35407: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882565.35433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882565.35537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882565.35569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882565.35660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882565.37494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882565.37499: stdout chunk (state=3): >>><<< 27885 1726882565.37501: stderr chunk (state=3): >>><<< 27885 1726882565.37504: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882565.37506: _low_level_execute_command(): starting 27885 1726882565.37509: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882565.2769487-29600-223180596682414/AnsiballZ_command.py && sleep 0' 27885 1726882565.38008: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882565.38014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882565.38051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882565.38113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882565.38142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882565.38233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882565.54834: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "start": "2024-09-20 21:36:05.531314", "end": "2024-09-20 21:36:05.547601", "delta": "0:00:00.016287", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27885 1726882565.56180: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882565.56217: stderr chunk (state=3): >>><<< 27885 1726882565.56221: stdout chunk (state=3): >>><<< 27885 1726882565.56234: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "start": "2024-09-20 21:36:05.531314", "end": "2024-09-20 21:36:05.547601", "delta": "0:00:00.016287", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.14.69 closed. 
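For context, the module result above is the "Get NM profile info" check: nmcli lists each connection together with the file that backs it, and the grep chain only succeeds if an ethtest1 profile stored under /etc is found, so rc=1 here means no /etc-backed profile remains. Reconstructed from the module arguments and the "...ignoring" result in this log (the register and option names beyond what the log shows are assumptions, not the role's exact source), the task looks roughly like:

    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc
      register: nm_profile_exists   # the next task's conditional references nm_profile_exists.rc
      ignore_errors: true           # the non-zero return code is reported and then ignored below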
27885 1726882565.56262: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882565.2769487-29600-223180596682414/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882565.56269: _low_level_execute_command(): starting 27885 1726882565.56274: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882565.2769487-29600-223180596682414/ > /dev/null 2>&1 && sleep 0' 27885 1726882565.56701: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882565.56710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882565.56730: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882565.56777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882565.56781: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882565.56850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882565.58799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882565.58803: stdout chunk (state=3): >>><<< 27885 1726882565.58805: stderr chunk (state=3): >>><<< 27885 1726882565.58807: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882565.58810: handler run complete 27885 1726882565.58812: Evaluated conditional (False): False 27885 1726882565.58814: attempt loop complete, returning result 27885 1726882565.58816: _execute() done 27885 1726882565.58818: dumping result to json 27885 1726882565.58819: done dumping result, returning 27885 1726882565.58821: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [12673a56-9f93-3fa5-01be-000000000b7c] 27885 1726882565.58823: sending task result for task 12673a56-9f93-3fa5-01be-000000000b7c 27885 1726882565.58899: done sending task result for task 12673a56-9f93-3fa5-01be-000000000b7c 27885 1726882565.58902: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "delta": "0:00:00.016287", "end": "2024-09-20 21:36:05.547601", "rc": 1, "start": "2024-09-20 21:36:05.531314" } MSG: non-zero return code ...ignoring 27885 1726882565.58983: no more pending results, returning what we have 27885 1726882565.58990: results queue empty 27885 1726882565.58991: checking for any_errors_fatal 27885 1726882565.59000: done checking for any_errors_fatal 27885 1726882565.59001: checking for max_fail_percentage 27885 1726882565.59003: done checking for max_fail_percentage 27885 1726882565.59004: checking to see if all hosts have failed and the running result is not ok 27885 1726882565.59004: done checking to see if all hosts have failed 27885 1726882565.59005: getting the remaining hosts for this loop 27885 1726882565.59007: done getting the remaining hosts for this loop 27885 1726882565.59010: getting the next task for host managed_node2 27885 1726882565.59026: done getting next task for host managed_node2 27885 1726882565.59029: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 27885 1726882565.59035: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882565.59040: getting variables 27885 1726882565.59041: in VariableManager get_vars() 27885 1726882565.59085: Calling all_inventory to load vars for managed_node2 27885 1726882565.59092: Calling groups_inventory to load vars for managed_node2 27885 1726882565.59244: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882565.59256: Calling all_plugins_play to load vars for managed_node2 27885 1726882565.59258: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882565.59261: Calling groups_plugins_play to load vars for managed_node2 27885 1726882565.65552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882565.66666: done with get_vars() 27885 1726882565.66682: done getting variables 27885 1726882565.66721: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:36:05 -0400 (0:00:00.444) 0:00:38.309 ****** 27885 1726882565.66741: entering _queue_task() for managed_node2/set_fact 27885 1726882565.66992: worker is 1 (out of 1 available) 27885 1726882565.67005: exiting _queue_task() for managed_node2/set_fact 27885 1726882565.67017: done queuing things up, now waiting for results queue to drain 27885 1726882565.67019: waiting for pending results... 
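The task queued here only fires when the previous nmcli check succeeded. Based on the task title, the conditional, and the lsr_net_profile_exists flag asserted on later in this run (the second fact name is an assumption), a minimal sketch of the set_fact step is:

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      set_fact:
        lsr_net_profile_exists: true            # asserted on later via "not lsr_net_profile_exists"
        lsr_net_profile_ansible_managed: true   # assumed name for the "ansible_managed flag" in the task title
      when: nm_profile_exists.rc == 0           # evaluates to False in the records below, so the task is skipped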
27885 1726882565.67215: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 27885 1726882565.67299: in run() - task 12673a56-9f93-3fa5-01be-000000000b7d 27885 1726882565.67312: variable 'ansible_search_path' from source: unknown 27885 1726882565.67316: variable 'ansible_search_path' from source: unknown 27885 1726882565.67342: calling self._execute() 27885 1726882565.67424: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882565.67430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882565.67440: variable 'omit' from source: magic vars 27885 1726882565.67720: variable 'ansible_distribution_major_version' from source: facts 27885 1726882565.67730: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882565.67824: variable 'nm_profile_exists' from source: set_fact 27885 1726882565.67836: Evaluated conditional (nm_profile_exists.rc == 0): False 27885 1726882565.67840: when evaluation is False, skipping this task 27885 1726882565.67843: _execute() done 27885 1726882565.67845: dumping result to json 27885 1726882565.67848: done dumping result, returning 27885 1726882565.67855: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-3fa5-01be-000000000b7d] 27885 1726882565.67860: sending task result for task 12673a56-9f93-3fa5-01be-000000000b7d 27885 1726882565.67943: done sending task result for task 12673a56-9f93-3fa5-01be-000000000b7d 27885 1726882565.67946: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 27885 1726882565.67988: no more pending results, returning what we have 27885 1726882565.67992: results queue empty 27885 1726882565.67995: checking for any_errors_fatal 27885 1726882565.68005: done checking for any_errors_fatal 27885 1726882565.68005: checking for max_fail_percentage 27885 1726882565.68007: done checking for max_fail_percentage 27885 1726882565.68008: checking to see if all hosts have failed and the running result is not ok 27885 1726882565.68009: done checking to see if all hosts have failed 27885 1726882565.68009: getting the remaining hosts for this loop 27885 1726882565.68011: done getting the remaining hosts for this loop 27885 1726882565.68015: getting the next task for host managed_node2 27885 1726882565.68025: done getting next task for host managed_node2 27885 1726882565.68027: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 27885 1726882565.68034: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882565.68039: getting variables 27885 1726882565.68041: in VariableManager get_vars() 27885 1726882565.68079: Calling all_inventory to load vars for managed_node2 27885 1726882565.68081: Calling groups_inventory to load vars for managed_node2 27885 1726882565.68084: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882565.68095: Calling all_plugins_play to load vars for managed_node2 27885 1726882565.68098: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882565.68100: Calling groups_plugins_play to load vars for managed_node2 27885 1726882565.68859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882565.69739: done with get_vars() 27885 1726882565.69754: done getting variables 27885 1726882565.69797: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882565.69878: variable 'profile' from source: include params 27885 1726882565.69880: variable 'item' from source: include params 27885 1726882565.69922: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-ethtest1] *********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:36:05 -0400 (0:00:00.032) 0:00:38.341 ****** 27885 1726882565.69947: entering _queue_task() for managed_node2/command 27885 1726882565.70154: worker is 1 (out of 1 available) 27885 1726882565.70166: exiting _queue_task() for managed_node2/command 27885 1726882565.70179: done queuing things up, now waiting for results queue to drain 27885 1726882565.70180: waiting for pending results... 
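The four get/verify tasks that follow (ansible_managed comment and fingerprint comment, get_profile_stat.yml:49-69) are all guarded the same way: an earlier stat of the ifcfg file is registered as profile_stat, and each task only runs when that file exists. Since ethtest1 has no ifcfg file, all four are skipped with false_condition 'profile_stat.stat.exists'. A minimal sketch of the pattern, under the assumption that the stat targets the usual ifcfg path; the file path, grep pattern, and register names are illustrative, since the actual task bodies are not shown in this excerpt:

    - name: Get stat of the profile file
      stat:
        path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
      register: profile_stat

    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      command: grep 'Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
      register: ansible_managed_grep              # assumed register name
      when: profile_stat.stat.exists

    - name: Verify the ansible_managed comment in ifcfg-{{ profile }}
      set_fact:
        lsr_net_profile_ansible_managed: "{{ ansible_managed_grep.rc == 0 }}"   # illustrative
      when: profile_stat.stat.exists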
27885 1726882565.70360: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-ethtest1 27885 1726882565.70454: in run() - task 12673a56-9f93-3fa5-01be-000000000b7f 27885 1726882565.70466: variable 'ansible_search_path' from source: unknown 27885 1726882565.70469: variable 'ansible_search_path' from source: unknown 27885 1726882565.70498: calling self._execute() 27885 1726882565.70574: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882565.70578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882565.70590: variable 'omit' from source: magic vars 27885 1726882565.70855: variable 'ansible_distribution_major_version' from source: facts 27885 1726882565.70865: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882565.70949: variable 'profile_stat' from source: set_fact 27885 1726882565.70961: Evaluated conditional (profile_stat.stat.exists): False 27885 1726882565.70964: when evaluation is False, skipping this task 27885 1726882565.70967: _execute() done 27885 1726882565.70970: dumping result to json 27885 1726882565.70972: done dumping result, returning 27885 1726882565.70975: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-ethtest1 [12673a56-9f93-3fa5-01be-000000000b7f] 27885 1726882565.70981: sending task result for task 12673a56-9f93-3fa5-01be-000000000b7f 27885 1726882565.71057: done sending task result for task 12673a56-9f93-3fa5-01be-000000000b7f 27885 1726882565.71062: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27885 1726882565.71112: no more pending results, returning what we have 27885 1726882565.71115: results queue empty 27885 1726882565.71117: checking for any_errors_fatal 27885 1726882565.71124: done checking for any_errors_fatal 27885 1726882565.71125: checking for max_fail_percentage 27885 1726882565.71126: done checking for max_fail_percentage 27885 1726882565.71127: checking to see if all hosts have failed and the running result is not ok 27885 1726882565.71128: done checking to see if all hosts have failed 27885 1726882565.71129: getting the remaining hosts for this loop 27885 1726882565.71130: done getting the remaining hosts for this loop 27885 1726882565.71133: getting the next task for host managed_node2 27885 1726882565.71141: done getting next task for host managed_node2 27885 1726882565.71143: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 27885 1726882565.71149: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882565.71153: getting variables 27885 1726882565.71154: in VariableManager get_vars() 27885 1726882565.71191: Calling all_inventory to load vars for managed_node2 27885 1726882565.71201: Calling groups_inventory to load vars for managed_node2 27885 1726882565.71204: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882565.71214: Calling all_plugins_play to load vars for managed_node2 27885 1726882565.71216: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882565.71219: Calling groups_plugins_play to load vars for managed_node2 27885 1726882565.72071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882565.72940: done with get_vars() 27885 1726882565.72954: done getting variables 27885 1726882565.72998: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882565.73071: variable 'profile' from source: include params 27885 1726882565.73074: variable 'item' from source: include params 27885 1726882565.73115: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-ethtest1] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:36:05 -0400 (0:00:00.031) 0:00:38.373 ****** 27885 1726882565.73136: entering _queue_task() for managed_node2/set_fact 27885 1726882565.73351: worker is 1 (out of 1 available) 27885 1726882565.73364: exiting _queue_task() for managed_node2/set_fact 27885 1726882565.73376: done queuing things up, now waiting for results queue to drain 27885 1726882565.73378: waiting for pending results... 
27885 1726882565.73547: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest1 27885 1726882565.73635: in run() - task 12673a56-9f93-3fa5-01be-000000000b80 27885 1726882565.73646: variable 'ansible_search_path' from source: unknown 27885 1726882565.73649: variable 'ansible_search_path' from source: unknown 27885 1726882565.73676: calling self._execute() 27885 1726882565.73752: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882565.73757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882565.73765: variable 'omit' from source: magic vars 27885 1726882565.74020: variable 'ansible_distribution_major_version' from source: facts 27885 1726882565.74030: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882565.74114: variable 'profile_stat' from source: set_fact 27885 1726882565.74123: Evaluated conditional (profile_stat.stat.exists): False 27885 1726882565.74126: when evaluation is False, skipping this task 27885 1726882565.74130: _execute() done 27885 1726882565.74132: dumping result to json 27885 1726882565.74135: done dumping result, returning 27885 1726882565.74141: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest1 [12673a56-9f93-3fa5-01be-000000000b80] 27885 1726882565.74146: sending task result for task 12673a56-9f93-3fa5-01be-000000000b80 27885 1726882565.74238: done sending task result for task 12673a56-9f93-3fa5-01be-000000000b80 27885 1726882565.74240: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27885 1726882565.74301: no more pending results, returning what we have 27885 1726882565.74304: results queue empty 27885 1726882565.74305: checking for any_errors_fatal 27885 1726882565.74309: done checking for any_errors_fatal 27885 1726882565.74309: checking for max_fail_percentage 27885 1726882565.74311: done checking for max_fail_percentage 27885 1726882565.74311: checking to see if all hosts have failed and the running result is not ok 27885 1726882565.74312: done checking to see if all hosts have failed 27885 1726882565.74313: getting the remaining hosts for this loop 27885 1726882565.74314: done getting the remaining hosts for this loop 27885 1726882565.74316: getting the next task for host managed_node2 27885 1726882565.74324: done getting next task for host managed_node2 27885 1726882565.74326: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 27885 1726882565.74331: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882565.74334: getting variables 27885 1726882565.74335: in VariableManager get_vars() 27885 1726882565.74365: Calling all_inventory to load vars for managed_node2 27885 1726882565.74367: Calling groups_inventory to load vars for managed_node2 27885 1726882565.74369: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882565.74377: Calling all_plugins_play to load vars for managed_node2 27885 1726882565.74380: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882565.74382: Calling groups_plugins_play to load vars for managed_node2 27885 1726882565.75120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882565.76080: done with get_vars() 27885 1726882565.76099: done getting variables 27885 1726882565.76140: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882565.76212: variable 'profile' from source: include params 27885 1726882565.76215: variable 'item' from source: include params 27885 1726882565.76254: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-ethtest1] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:36:05 -0400 (0:00:00.031) 0:00:38.405 ****** 27885 1726882565.76276: entering _queue_task() for managed_node2/command 27885 1726882565.76479: worker is 1 (out of 1 available) 27885 1726882565.76498: exiting _queue_task() for managed_node2/command 27885 1726882565.76508: done queuing things up, now waiting for results queue to drain 27885 1726882565.76510: waiting for pending results... 
27885 1726882565.76678: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-ethtest1 27885 1726882565.76766: in run() - task 12673a56-9f93-3fa5-01be-000000000b81 27885 1726882565.76779: variable 'ansible_search_path' from source: unknown 27885 1726882565.76782: variable 'ansible_search_path' from source: unknown 27885 1726882565.76811: calling self._execute() 27885 1726882565.76885: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882565.76892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882565.76900: variable 'omit' from source: magic vars 27885 1726882565.77152: variable 'ansible_distribution_major_version' from source: facts 27885 1726882565.77161: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882565.77249: variable 'profile_stat' from source: set_fact 27885 1726882565.77258: Evaluated conditional (profile_stat.stat.exists): False 27885 1726882565.77261: when evaluation is False, skipping this task 27885 1726882565.77264: _execute() done 27885 1726882565.77267: dumping result to json 27885 1726882565.77270: done dumping result, returning 27885 1726882565.77279: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-ethtest1 [12673a56-9f93-3fa5-01be-000000000b81] 27885 1726882565.77287: sending task result for task 12673a56-9f93-3fa5-01be-000000000b81 27885 1726882565.77369: done sending task result for task 12673a56-9f93-3fa5-01be-000000000b81 27885 1726882565.77372: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27885 1726882565.77436: no more pending results, returning what we have 27885 1726882565.77439: results queue empty 27885 1726882565.77440: checking for any_errors_fatal 27885 1726882565.77444: done checking for any_errors_fatal 27885 1726882565.77445: checking for max_fail_percentage 27885 1726882565.77446: done checking for max_fail_percentage 27885 1726882565.77447: checking to see if all hosts have failed and the running result is not ok 27885 1726882565.77448: done checking to see if all hosts have failed 27885 1726882565.77448: getting the remaining hosts for this loop 27885 1726882565.77450: done getting the remaining hosts for this loop 27885 1726882565.77453: getting the next task for host managed_node2 27885 1726882565.77458: done getting next task for host managed_node2 27885 1726882565.77461: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 27885 1726882565.77465: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882565.77469: getting variables 27885 1726882565.77470: in VariableManager get_vars() 27885 1726882565.77502: Calling all_inventory to load vars for managed_node2 27885 1726882565.77505: Calling groups_inventory to load vars for managed_node2 27885 1726882565.77508: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882565.77517: Calling all_plugins_play to load vars for managed_node2 27885 1726882565.77519: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882565.77521: Calling groups_plugins_play to load vars for managed_node2 27885 1726882565.78262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882565.79138: done with get_vars() 27885 1726882565.79152: done getting variables 27885 1726882565.79197: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882565.79268: variable 'profile' from source: include params 27885 1726882565.79270: variable 'item' from source: include params 27885 1726882565.79314: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-ethtest1] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:36:05 -0400 (0:00:00.030) 0:00:38.435 ****** 27885 1726882565.79335: entering _queue_task() for managed_node2/set_fact 27885 1726882565.79549: worker is 1 (out of 1 available) 27885 1726882565.79563: exiting _queue_task() for managed_node2/set_fact 27885 1726882565.79576: done queuing things up, now waiting for results queue to drain 27885 1726882565.79577: waiting for pending results... 
27885 1726882565.79755: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-ethtest1 27885 1726882565.79845: in run() - task 12673a56-9f93-3fa5-01be-000000000b82 27885 1726882565.79857: variable 'ansible_search_path' from source: unknown 27885 1726882565.79860: variable 'ansible_search_path' from source: unknown 27885 1726882565.79887: calling self._execute() 27885 1726882565.79968: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882565.79972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882565.79980: variable 'omit' from source: magic vars 27885 1726882565.80248: variable 'ansible_distribution_major_version' from source: facts 27885 1726882565.80261: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882565.80345: variable 'profile_stat' from source: set_fact 27885 1726882565.80351: Evaluated conditional (profile_stat.stat.exists): False 27885 1726882565.80354: when evaluation is False, skipping this task 27885 1726882565.80357: _execute() done 27885 1726882565.80363: dumping result to json 27885 1726882565.80366: done dumping result, returning 27885 1726882565.80374: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-ethtest1 [12673a56-9f93-3fa5-01be-000000000b82] 27885 1726882565.80377: sending task result for task 12673a56-9f93-3fa5-01be-000000000b82 27885 1726882565.80463: done sending task result for task 12673a56-9f93-3fa5-01be-000000000b82 27885 1726882565.80466: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27885 1726882565.80511: no more pending results, returning what we have 27885 1726882565.80516: results queue empty 27885 1726882565.80517: checking for any_errors_fatal 27885 1726882565.80524: done checking for any_errors_fatal 27885 1726882565.80525: checking for max_fail_percentage 27885 1726882565.80526: done checking for max_fail_percentage 27885 1726882565.80527: checking to see if all hosts have failed and the running result is not ok 27885 1726882565.80528: done checking to see if all hosts have failed 27885 1726882565.80528: getting the remaining hosts for this loop 27885 1726882565.80530: done getting the remaining hosts for this loop 27885 1726882565.80533: getting the next task for host managed_node2 27885 1726882565.80542: done getting next task for host managed_node2 27885 1726882565.80545: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 27885 1726882565.80549: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882565.80553: getting variables 27885 1726882565.80554: in VariableManager get_vars() 27885 1726882565.80587: Calling all_inventory to load vars for managed_node2 27885 1726882565.80590: Calling groups_inventory to load vars for managed_node2 27885 1726882565.80592: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882565.80608: Calling all_plugins_play to load vars for managed_node2 27885 1726882565.80611: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882565.80613: Calling groups_plugins_play to load vars for managed_node2 27885 1726882565.81517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882565.82368: done with get_vars() 27885 1726882565.82384: done getting variables 27885 1726882565.82428: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27885 1726882565.82510: variable 'profile' from source: include params 27885 1726882565.82513: variable 'item' from source: include params 27885 1726882565.82553: variable 'item' from source: include params TASK [Assert that the profile is absent - 'ethtest1'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:36:05 -0400 (0:00:00.032) 0:00:38.468 ****** 27885 1726882565.82576: entering _queue_task() for managed_node2/assert 27885 1726882565.82808: worker is 1 (out of 1 available) 27885 1726882565.82823: exiting _queue_task() for managed_node2/assert 27885 1726882565.82836: done queuing things up, now waiting for results queue to drain 27885 1726882565.82838: waiting for pending results... 
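The assertion queued here closes the loop on the nmcli check: the lsr_net_profile_exists flag was never promoted to true, so asserting that it is false verifies the ethtest1 profile is absent. From the conditional shown in the result below, the task in assert_profile_absent.yml is roughly as follows (the failure message is an assumption, not visible in this log):

    - name: Assert that the profile is absent - '{{ profile }}'
      assert:
        that:
          - not lsr_net_profile_exists
        fail_msg: "Profile {{ profile }} still exists"   # assumed message text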
27885 1726882565.83015: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'ethtest1' 27885 1726882565.83096: in run() - task 12673a56-9f93-3fa5-01be-000000000a72 27885 1726882565.83109: variable 'ansible_search_path' from source: unknown 27885 1726882565.83113: variable 'ansible_search_path' from source: unknown 27885 1726882565.83140: calling self._execute() 27885 1726882565.83223: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882565.83227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882565.83235: variable 'omit' from source: magic vars 27885 1726882565.83503: variable 'ansible_distribution_major_version' from source: facts 27885 1726882565.83513: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882565.83519: variable 'omit' from source: magic vars 27885 1726882565.83550: variable 'omit' from source: magic vars 27885 1726882565.83619: variable 'profile' from source: include params 27885 1726882565.83623: variable 'item' from source: include params 27885 1726882565.83667: variable 'item' from source: include params 27885 1726882565.83682: variable 'omit' from source: magic vars 27885 1726882565.83720: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882565.83748: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882565.83763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882565.83777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882565.83786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882565.83814: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882565.83817: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882565.83821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882565.83888: Set connection var ansible_pipelining to False 27885 1726882565.83897: Set connection var ansible_connection to ssh 27885 1726882565.83903: Set connection var ansible_timeout to 10 27885 1726882565.83905: Set connection var ansible_shell_type to sh 27885 1726882565.83910: Set connection var ansible_shell_executable to /bin/sh 27885 1726882565.83915: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882565.83938: variable 'ansible_shell_executable' from source: unknown 27885 1726882565.83943: variable 'ansible_connection' from source: unknown 27885 1726882565.83945: variable 'ansible_module_compression' from source: unknown 27885 1726882565.83947: variable 'ansible_shell_type' from source: unknown 27885 1726882565.83950: variable 'ansible_shell_executable' from source: unknown 27885 1726882565.83952: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882565.83954: variable 'ansible_pipelining' from source: unknown 27885 1726882565.83956: variable 'ansible_timeout' from source: unknown 27885 1726882565.83959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882565.84053: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882565.84063: variable 'omit' from source: magic vars 27885 1726882565.84070: starting attempt loop 27885 1726882565.84072: running the handler 27885 1726882565.84156: variable 'lsr_net_profile_exists' from source: set_fact 27885 1726882565.84160: Evaluated conditional (not lsr_net_profile_exists): True 27885 1726882565.84162: handler run complete 27885 1726882565.84174: attempt loop complete, returning result 27885 1726882565.84177: _execute() done 27885 1726882565.84179: dumping result to json 27885 1726882565.84184: done dumping result, returning 27885 1726882565.84192: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'ethtest1' [12673a56-9f93-3fa5-01be-000000000a72] 27885 1726882565.84199: sending task result for task 12673a56-9f93-3fa5-01be-000000000a72 27885 1726882565.84283: done sending task result for task 12673a56-9f93-3fa5-01be-000000000a72 27885 1726882565.84286: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 27885 1726882565.84329: no more pending results, returning what we have 27885 1726882565.84332: results queue empty 27885 1726882565.84333: checking for any_errors_fatal 27885 1726882565.84341: done checking for any_errors_fatal 27885 1726882565.84342: checking for max_fail_percentage 27885 1726882565.84343: done checking for max_fail_percentage 27885 1726882565.84344: checking to see if all hosts have failed and the running result is not ok 27885 1726882565.84345: done checking to see if all hosts have failed 27885 1726882565.84345: getting the remaining hosts for this loop 27885 1726882565.84347: done getting the remaining hosts for this loop 27885 1726882565.84350: getting the next task for host managed_node2 27885 1726882565.84359: done getting next task for host managed_node2 27885 1726882565.84362: ^ task is: TASK: Verify network state restored to default 27885 1726882565.84367: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882565.84372: getting variables 27885 1726882565.84373: in VariableManager get_vars() 27885 1726882565.84409: Calling all_inventory to load vars for managed_node2 27885 1726882565.84412: Calling groups_inventory to load vars for managed_node2 27885 1726882565.84415: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882565.84423: Calling all_plugins_play to load vars for managed_node2 27885 1726882565.84426: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882565.84428: Calling groups_plugins_play to load vars for managed_node2 27885 1726882565.85205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882565.86083: done with get_vars() 27885 1726882565.86103: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:169 Friday 20 September 2024 21:36:05 -0400 (0:00:00.035) 0:00:38.504 ****** 27885 1726882565.86167: entering _queue_task() for managed_node2/include_tasks 27885 1726882565.86376: worker is 1 (out of 1 available) 27885 1726882565.86392: exiting _queue_task() for managed_node2/include_tasks 27885 1726882565.86407: done queuing things up, now waiting for results queue to drain 27885 1726882565.86409: waiting for pending results... 27885 1726882565.86583: running TaskExecutor() for managed_node2/TASK: Verify network state restored to default 27885 1726882565.86655: in run() - task 12673a56-9f93-3fa5-01be-0000000000bb 27885 1726882565.86667: variable 'ansible_search_path' from source: unknown 27885 1726882565.86697: calling self._execute() 27885 1726882565.86782: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882565.86789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882565.86796: variable 'omit' from source: magic vars 27885 1726882565.87068: variable 'ansible_distribution_major_version' from source: facts 27885 1726882565.87079: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882565.87083: _execute() done 27885 1726882565.87088: dumping result to json 27885 1726882565.87091: done dumping result, returning 27885 1726882565.87096: done running TaskExecutor() for managed_node2/TASK: Verify network state restored to default [12673a56-9f93-3fa5-01be-0000000000bb] 27885 1726882565.87102: sending task result for task 12673a56-9f93-3fa5-01be-0000000000bb 27885 1726882565.87183: done sending task result for task 12673a56-9f93-3fa5-01be-0000000000bb 27885 1726882565.87189: WORKER PROCESS EXITING 27885 1726882565.87218: no more pending results, returning what we have 27885 1726882565.87223: in VariableManager get_vars() 27885 1726882565.87265: Calling all_inventory to load vars for managed_node2 27885 1726882565.87268: Calling groups_inventory to load vars for managed_node2 27885 1726882565.87270: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882565.87280: Calling all_plugins_play to load vars for managed_node2 27885 1726882565.87282: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882565.87284: Calling groups_plugins_play to load vars for managed_node2 27885 1726882565.88181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882565.89038: done with get_vars() 27885 
1726882565.89051: variable 'ansible_search_path' from source: unknown 27885 1726882565.89061: we have included files to process 27885 1726882565.89062: generating all_blocks data 27885 1726882565.89063: done generating all_blocks data 27885 1726882565.89068: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 27885 1726882565.89068: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 27885 1726882565.89070: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 27885 1726882565.89333: done processing included file 27885 1726882565.89334: iterating over new_blocks loaded from include file 27885 1726882565.89335: in VariableManager get_vars() 27885 1726882565.89347: done with get_vars() 27885 1726882565.89348: filtering new block on tags 27885 1726882565.89372: done filtering new block on tags 27885 1726882565.89374: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 27885 1726882565.89378: extending task lists for all hosts with included blocks 27885 1726882565.90429: done extending task lists 27885 1726882565.90430: done processing included files 27885 1726882565.90431: results queue empty 27885 1726882565.90431: checking for any_errors_fatal 27885 1726882565.90434: done checking for any_errors_fatal 27885 1726882565.90434: checking for max_fail_percentage 27885 1726882565.90435: done checking for max_fail_percentage 27885 1726882565.90436: checking to see if all hosts have failed and the running result is not ok 27885 1726882565.90436: done checking to see if all hosts have failed 27885 1726882565.90437: getting the remaining hosts for this loop 27885 1726882565.90438: done getting the remaining hosts for this loop 27885 1726882565.90440: getting the next task for host managed_node2 27885 1726882565.90443: done getting next task for host managed_node2 27885 1726882565.90445: ^ task is: TASK: Check routes and DNS 27885 1726882565.90446: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27885 1726882565.90449: getting variables 27885 1726882565.90449: in VariableManager get_vars() 27885 1726882565.90459: Calling all_inventory to load vars for managed_node2 27885 1726882565.90460: Calling groups_inventory to load vars for managed_node2 27885 1726882565.90461: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882565.90465: Calling all_plugins_play to load vars for managed_node2 27885 1726882565.90467: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882565.90468: Calling groups_plugins_play to load vars for managed_node2 27885 1726882565.91096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882565.92020: done with get_vars() 27885 1726882565.92034: done getting variables 27885 1726882565.92064: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:36:05 -0400 (0:00:00.059) 0:00:38.563 ****** 27885 1726882565.92088: entering _queue_task() for managed_node2/shell 27885 1726882565.92352: worker is 1 (out of 1 available) 27885 1726882565.92365: exiting _queue_task() for managed_node2/shell 27885 1726882565.92378: done queuing things up, now waiting for results queue to drain 27885 1726882565.92380: waiting for pending results... 27885 1726882565.92562: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 27885 1726882565.92635: in run() - task 12673a56-9f93-3fa5-01be-000000000bb6 27885 1726882565.92647: variable 'ansible_search_path' from source: unknown 27885 1726882565.92651: variable 'ansible_search_path' from source: unknown 27885 1726882565.92679: calling self._execute() 27885 1726882565.92761: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882565.92765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882565.92774: variable 'omit' from source: magic vars 27885 1726882565.93048: variable 'ansible_distribution_major_version' from source: facts 27885 1726882565.93056: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882565.93062: variable 'omit' from source: magic vars 27885 1726882565.93094: variable 'omit' from source: magic vars 27885 1726882565.93117: variable 'omit' from source: magic vars 27885 1726882565.93150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882565.93177: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882565.93196: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882565.93210: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882565.93222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882565.93245: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
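The include at tests_route_device.yml:169 pulls in check_network_dns.yml, whose first task ("Check routes and DNS", a shell action per the records above) dumps the current routing and resolver state into the test log. The file paths and task names come from this log; the relative include path and the shell body below are assumptions, since the actual command is not shown in this excerpt:

    # tests_route_device.yml:169
    - name: Verify network state restored to default
      include_tasks: tasks/check_network_dns.yml

    # tasks/check_network_dns.yml:6 (command body is illustrative)
    - name: Check routes and DNS
      shell: |
        ip route
        ip -6 route
        cat /etc/resolv.conf
      register: route_dns_state   # assumed register name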
27885 1726882565.93248: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882565.93251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882565.93323: Set connection var ansible_pipelining to False 27885 1726882565.93327: Set connection var ansible_connection to ssh 27885 1726882565.93332: Set connection var ansible_timeout to 10 27885 1726882565.93335: Set connection var ansible_shell_type to sh 27885 1726882565.93340: Set connection var ansible_shell_executable to /bin/sh 27885 1726882565.93344: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882565.93362: variable 'ansible_shell_executable' from source: unknown 27885 1726882565.93365: variable 'ansible_connection' from source: unknown 27885 1726882565.93370: variable 'ansible_module_compression' from source: unknown 27885 1726882565.93373: variable 'ansible_shell_type' from source: unknown 27885 1726882565.93376: variable 'ansible_shell_executable' from source: unknown 27885 1726882565.93378: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882565.93380: variable 'ansible_pipelining' from source: unknown 27885 1726882565.93383: variable 'ansible_timeout' from source: unknown 27885 1726882565.93385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882565.93482: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882565.93492: variable 'omit' from source: magic vars 27885 1726882565.93505: starting attempt loop 27885 1726882565.93508: running the handler 27885 1726882565.93511: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882565.93531: _low_level_execute_command(): starting 27885 1726882565.93537: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882565.94060: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882565.94064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882565.94068: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882565.94119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 
1726882565.94122: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882565.94125: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882565.94198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882565.95850: stdout chunk (state=3): >>>/root <<< 27885 1726882565.95947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882565.95977: stderr chunk (state=3): >>><<< 27885 1726882565.95983: stdout chunk (state=3): >>><<< 27885 1726882565.96012: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882565.96024: _low_level_execute_command(): starting 27885 1726882565.96030: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882565.9601135-29632-65700320877518 `" && echo ansible-tmp-1726882565.9601135-29632-65700320877518="` echo /root/.ansible/tmp/ansible-tmp-1726882565.9601135-29632-65700320877518 `" ) && sleep 0' 27885 1726882565.96454: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882565.96461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882565.96490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882565.96498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882565.96544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 
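The backquote-heavy one-liner executed just above is easier to follow unrolled. This is a simplified rendering of what the controller ran on the target (the literal command also wraps each path in a backquoted echo and appends "&& sleep 0"; the directory suffix is copied verbatim from this run and is unique to this task invocation):

```sh
# Simplified rendering of the remote temp-dir setup shown above: create a
# private per-task working directory (umask 77, so mode 0700) and echo its
# path back so the controller knows where to upload AnsiballZ_command.py.
umask 77
mkdir -p /root/.ansible/tmp
mkdir /root/.ansible/tmp/ansible-tmp-1726882565.9601135-29632-65700320877518
echo ansible-tmp-1726882565.9601135-29632-65700320877518=/root/.ansible/tmp/ansible-tmp-1726882565.9601135-29632-65700320877518
```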
27885 1726882565.96563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882565.96620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882565.98500: stdout chunk (state=3): >>>ansible-tmp-1726882565.9601135-29632-65700320877518=/root/.ansible/tmp/ansible-tmp-1726882565.9601135-29632-65700320877518 <<< 27885 1726882565.98612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882565.98640: stderr chunk (state=3): >>><<< 27885 1726882565.98644: stdout chunk (state=3): >>><<< 27885 1726882565.98658: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882565.9601135-29632-65700320877518=/root/.ansible/tmp/ansible-tmp-1726882565.9601135-29632-65700320877518 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882565.98687: variable 'ansible_module_compression' from source: unknown 27885 1726882565.98734: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882565.98765: variable 'ansible_facts' from source: unknown 27885 1726882565.98821: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882565.9601135-29632-65700320877518/AnsiballZ_command.py 27885 1726882565.98927: Sending initial data 27885 1726882565.98930: Sent initial data (155 bytes) 27885 1726882565.99361: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882565.99398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882565.99402: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882565.99404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882565.99406: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882565.99408: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882565.99410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882565.99458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882565.99461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882565.99524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882566.01048: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 27885 1726882566.01055: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882566.01112: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882566.01176: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp1cxn2uby /root/.ansible/tmp/ansible-tmp-1726882565.9601135-29632-65700320877518/AnsiballZ_command.py <<< 27885 1726882566.01179: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882565.9601135-29632-65700320877518/AnsiballZ_command.py" <<< 27885 1726882566.01234: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmp1cxn2uby" to remote "/root/.ansible/tmp/ansible-tmp-1726882565.9601135-29632-65700320877518/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882565.9601135-29632-65700320877518/AnsiballZ_command.py" <<< 27885 1726882566.01831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882566.01874: stderr chunk (state=3): >>><<< 27885 1726882566.01878: stdout chunk (state=3): >>><<< 27885 1726882566.01913: done transferring module to remote 27885 1726882566.01923: _low_level_execute_command(): starting 27885 1726882566.01928: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882565.9601135-29632-65700320877518/ /root/.ansible/tmp/ansible-tmp-1726882565.9601135-29632-65700320877518/AnsiballZ_command.py && sleep 0' 27885 1726882566.02372: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882566.02411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882566.02415: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882566.02422: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882566.02424: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882566.02464: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882566.02467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882566.02537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882566.04266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882566.04294: stderr chunk (state=3): >>><<< 27885 1726882566.04298: stdout chunk (state=3): >>><<< 27885 1726882566.04313: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882566.04320: _low_level_execute_command(): starting 27885 1726882566.04325: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882565.9601135-29632-65700320877518/AnsiballZ_command.py && sleep 0' 27885 1726882566.04765: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882566.04769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882566.04806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882566.04809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27885 1726882566.04811: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882566.04814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882566.04862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882566.04865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882566.04875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882566.04950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882566.20825: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:c1:46:63:3b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.69/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3105sec preferred_lft 3105sec\n inet6 fe80::8ff:c1ff:fe46:633b/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\n24: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 6e:57:f6:54:9a:30 brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.69 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.69 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:36:06.198878", "end": "2024-09-20 21:36:06.207183", "delta": "0:00:00.008305", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 27885 1726882566.20829: stdout chunk (state=3): >>> <<< 27885 1726882566.22187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882566.22221: stderr chunk (state=3): >>><<< 27885 1726882566.22224: stdout chunk (state=3): >>><<< 27885 1726882566.22244: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:c1:46:63:3b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.69/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3105sec preferred_lft 3105sec\n inet6 fe80::8ff:c1ff:fe46:633b/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\n24: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 6e:57:f6:54:9a:30 brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.69 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.69 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:36:06.198878", "end": "2024-09-20 21:36:06.207183", "delta": "0:00:00.008305", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 Shared connection to 10.31.14.69 closed. 27885 1726882566.22280: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882565.9601135-29632-65700320877518/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882566.22289: _low_level_execute_command(): starting 27885 1726882566.22292: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882565.9601135-29632-65700320877518/ > /dev/null 2>&1 && sleep 0' 27885 1726882566.22754: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882566.22757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 27885 1726882566.22760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 27885 1726882566.22762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882566.22764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882566.22819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882566.22823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882566.22825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882566.22891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882566.24682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882566.24714: stderr chunk (state=3): >>><<< 27885 1726882566.24717: stdout chunk (state=3): >>><<< 27885 1726882566.24731: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882566.24737: handler run complete 27885 1726882566.24753: Evaluated conditional (False): False 27885 1726882566.24761: attempt loop complete, returning result 27885 1726882566.24764: _execute() done 27885 1726882566.24766: dumping result to json 27885 1726882566.24772: done dumping result, returning 27885 1726882566.24779: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [12673a56-9f93-3fa5-01be-000000000bb6] 27885 1726882566.24784: sending task result for task 12673a56-9f93-3fa5-01be-000000000bb6 27885 1726882566.24892: done sending task result for task 12673a56-9f93-3fa5-01be-000000000bb6 27885 1726882566.24897: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008305", "end": "2024-09-20 21:36:06.207183", "rc": 0, "start": "2024-09-20 21:36:06.198878" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:c1:46:63:3b brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.14.69/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3105sec preferred_lft 3105sec inet6 fe80::8ff:c1ff:fe46:633b/64 scope link noprefixroute valid_lft forever preferred_lft forever 24: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000 link/ether 6e:57:f6:54:9a:30 brd ff:ff:ff:ff:ff:ff inet 192.0.2.72/31 scope global noprefixroute rpltstbr valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.69 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.69 metric 100 192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 27885 1726882566.24974: no more pending results, returning what we have 27885 1726882566.24978: results queue empty 27885 1726882566.24979: checking for any_errors_fatal 27885 1726882566.24980: done checking for any_errors_fatal 27885 1726882566.24981: checking for max_fail_percentage 27885 1726882566.24982: done checking 
for max_fail_percentage 27885 1726882566.24983: checking to see if all hosts have failed and the running result is not ok 27885 1726882566.24984: done checking to see if all hosts have failed 27885 1726882566.24985: getting the remaining hosts for this loop 27885 1726882566.24988: done getting the remaining hosts for this loop 27885 1726882566.24992: getting the next task for host managed_node2 27885 1726882566.25001: done getting next task for host managed_node2 27885 1726882566.25004: ^ task is: TASK: Verify DNS and network connectivity 27885 1726882566.25008: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27885 1726882566.25017: getting variables 27885 1726882566.25019: in VariableManager get_vars() 27885 1726882566.25056: Calling all_inventory to load vars for managed_node2 27885 1726882566.25058: Calling groups_inventory to load vars for managed_node2 27885 1726882566.25060: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882566.25070: Calling all_plugins_play to load vars for managed_node2 27885 1726882566.25072: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882566.25075: Calling groups_plugins_play to load vars for managed_node2 27885 1726882566.25884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882566.26759: done with get_vars() 27885 1726882566.26775: done getting variables 27885 1726882566.26823: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:36:06 -0400 (0:00:00.347) 0:00:38.910 ****** 27885 1726882566.26846: entering _queue_task() for managed_node2/shell 27885 1726882566.27071: worker is 1 (out of 1 available) 27885 1726882566.27084: exiting _queue_task() for managed_node2/shell 27885 1726882566.27102: done queuing things up, now waiting for results queue to drain 27885 1726882566.27104: waiting for pending results... 
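The next task, Verify DNS and network connectivity (check_network_dns.yml:24), follows the same queue/fork/SSH pattern as the one above. Its payload, unescaped from the _raw_params captured in the module result below, amounts to this POSIX sh loop (comments added; hostnames and commands are as recorded in this run):

```sh
# Connectivity check run by the "Verify DNS and network connectivity" task:
# for each mirror, require both a successful name lookup and an HTTPS fetch.
set -euo pipefail
echo CHECK DNS AND CONNECTIVITY
for host in mirrors.fedoraproject.org mirrors.centos.org; do
    if ! getent hosts "$host"; then                  # DNS / nsswitch lookup
        echo FAILED to lookup host "$host"
        exit 1
    fi
    if ! curl -o /dev/null https://"$host"; then     # HTTPS reachability
        echo FAILED to contact host "$host"
        exit 1
    fi
done
```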
27885 1726882566.27273: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 27885 1726882566.27353: in run() - task 12673a56-9f93-3fa5-01be-000000000bb7 27885 1726882566.27365: variable 'ansible_search_path' from source: unknown 27885 1726882566.27369: variable 'ansible_search_path' from source: unknown 27885 1726882566.27398: calling self._execute() 27885 1726882566.27475: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882566.27478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882566.27490: variable 'omit' from source: magic vars 27885 1726882566.27752: variable 'ansible_distribution_major_version' from source: facts 27885 1726882566.27763: Evaluated conditional (ansible_distribution_major_version != '6'): True 27885 1726882566.27854: variable 'ansible_facts' from source: unknown 27885 1726882566.28291: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 27885 1726882566.28296: variable 'omit' from source: magic vars 27885 1726882566.28330: variable 'omit' from source: magic vars 27885 1726882566.28352: variable 'omit' from source: magic vars 27885 1726882566.28383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27885 1726882566.28413: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27885 1726882566.28432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27885 1726882566.28444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882566.28454: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27885 1726882566.28477: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27885 1726882566.28480: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882566.28483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882566.28555: Set connection var ansible_pipelining to False 27885 1726882566.28559: Set connection var ansible_connection to ssh 27885 1726882566.28564: Set connection var ansible_timeout to 10 27885 1726882566.28567: Set connection var ansible_shell_type to sh 27885 1726882566.28572: Set connection var ansible_shell_executable to /bin/sh 27885 1726882566.28577: Set connection var ansible_module_compression to ZIP_DEFLATED 27885 1726882566.28597: variable 'ansible_shell_executable' from source: unknown 27885 1726882566.28600: variable 'ansible_connection' from source: unknown 27885 1726882566.28603: variable 'ansible_module_compression' from source: unknown 27885 1726882566.28605: variable 'ansible_shell_type' from source: unknown 27885 1726882566.28607: variable 'ansible_shell_executable' from source: unknown 27885 1726882566.28609: variable 'ansible_host' from source: host vars for 'managed_node2' 27885 1726882566.28614: variable 'ansible_pipelining' from source: unknown 27885 1726882566.28616: variable 'ansible_timeout' from source: unknown 27885 1726882566.28620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27885 1726882566.28717: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882566.28725: variable 'omit' from source: magic vars 27885 1726882566.28730: starting attempt loop 27885 1726882566.28733: running the handler 27885 1726882566.28742: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27885 1726882566.28761: _low_level_execute_command(): starting 27885 1726882566.28767: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27885 1726882566.29278: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882566.29282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882566.29285: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882566.29287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882566.29345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882566.29348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882566.29350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882566.29420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882566.30989: stdout chunk (state=3): >>>/root <<< 27885 1726882566.31083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882566.31113: stderr chunk (state=3): >>><<< 27885 1726882566.31116: stdout chunk (state=3): >>><<< 27885 1726882566.31140: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882566.31152: _low_level_execute_command(): starting 27885 1726882566.31158: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882566.3114014-29640-277784769426093 `" && echo ansible-tmp-1726882566.3114014-29640-277784769426093="` echo /root/.ansible/tmp/ansible-tmp-1726882566.3114014-29640-277784769426093 `" ) && sleep 0' 27885 1726882566.31582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882566.31585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882566.31590: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882566.31592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882566.31642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882566.31645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882566.31718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882566.33559: stdout chunk (state=3): >>>ansible-tmp-1726882566.3114014-29640-277784769426093=/root/.ansible/tmp/ansible-tmp-1726882566.3114014-29640-277784769426093 <<< 27885 1726882566.33674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882566.33698: stderr chunk (state=3): >>><<< 27885 1726882566.33701: stdout chunk (state=3): >>><<< 27885 1726882566.33716: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882566.3114014-29640-277784769426093=/root/.ansible/tmp/ansible-tmp-1726882566.3114014-29640-277784769426093 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882566.33740: variable 'ansible_module_compression' from source: unknown 27885 1726882566.33782: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27885_1t3g5hz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27885 1726882566.33815: variable 'ansible_facts' from source: unknown 27885 1726882566.33862: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882566.3114014-29640-277784769426093/AnsiballZ_command.py 27885 1726882566.33958: Sending initial data 27885 1726882566.33964: Sent initial data (156 bytes) 27885 1726882566.34361: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882566.34368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882566.34390: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882566.34396: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882566.34399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882566.34456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882566.34462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882566.34464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882566.34524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882566.36041: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27885 1726882566.36099: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27885 1726882566.36160: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmphni1juz_ /root/.ansible/tmp/ansible-tmp-1726882566.3114014-29640-277784769426093/AnsiballZ_command.py <<< 27885 1726882566.36166: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882566.3114014-29640-277784769426093/AnsiballZ_command.py" <<< 27885 1726882566.36220: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-27885_1t3g5hz/tmphni1juz_" to remote "/root/.ansible/tmp/ansible-tmp-1726882566.3114014-29640-277784769426093/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882566.3114014-29640-277784769426093/AnsiballZ_command.py" <<< 27885 1726882566.36844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882566.36876: stderr chunk (state=3): >>><<< 27885 1726882566.36879: stdout chunk (state=3): >>><<< 27885 1726882566.36922: done transferring module to remote 27885 1726882566.36931: _low_level_execute_command(): starting 27885 1726882566.36934: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882566.3114014-29640-277784769426093/ /root/.ansible/tmp/ansible-tmp-1726882566.3114014-29640-277784769426093/AnsiballZ_command.py && sleep 0' 27885 1726882566.37356: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882566.37360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882566.37366: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882566.37368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 27885 1726882566.37370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882566.37411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882566.37414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882566.37480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882566.39183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882566.39207: stderr chunk (state=3): >>><<< 27885 1726882566.39210: stdout chunk (state=3): >>><<< 27885 1726882566.39222: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882566.39226: _low_level_execute_command(): starting 27885 1726882566.39229: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882566.3114014-29640-277784769426093/AnsiballZ_command.py && sleep 0' 27885 1726882566.39639: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882566.39642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882566.39645: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882566.39647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27885 1726882566.39687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 27885 1726882566.39691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882566.39768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882566.65363: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org 
mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 15023 0 --:--:-- --:--:-- --:--:-- 15250\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 4612 0 --:--:-- --:--:-- --:--:-- 4619", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:36:06.546391", "end": "2024-09-20 21:36:06.652536", "delta": "0:00:00.106145", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27885 1726882566.66989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 27885 1726882566.66998: stdout chunk (state=3): >>><<< 27885 1726882566.67001: stderr chunk (state=3): >>><<< 27885 1726882566.67098: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 15023 0 --:--:-- --:--:-- --:--:-- 15250\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 4612 0 --:--:-- --:--:-- --:--:-- 4619", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:36:06.546391", "end": "2024-09-20 21:36:06.652536", "delta": "0:00:00.106145", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 27885 1726882566.67108: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882566.3114014-29640-277784769426093/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27885 1726882566.67111: _low_level_execute_command(): starting 27885 1726882566.67114: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882566.3114014-29640-277784769426093/ > /dev/null 2>&1 && sleep 0' 27885 1726882566.67898: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27885 1726882566.67902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27885 1726882566.67904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27885 1726882566.67906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882566.67912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27885 1726882566.67916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 27885 1726882566.67919: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 27885 1726882566.67936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27885 1726882566.68029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27885 1726882566.69823: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27885 1726882566.69849: stderr chunk (state=3): >>><<< 27885 1726882566.69852: stdout chunk (state=3): >>><<< 27885 1726882566.69866: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27885 1726882566.69872: handler run complete 27885 1726882566.69899: Evaluated conditional (False): False 27885 1726882566.69907: attempt loop complete, returning result 27885 1726882566.69909: _execute() done 27885 1726882566.69912: dumping result to json 27885 1726882566.69917: done dumping result, returning 27885 1726882566.69925: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [12673a56-9f93-3fa5-01be-000000000bb7] 27885 1726882566.69930: sending task result for task 12673a56-9f93-3fa5-01be-000000000bb7 27885 1726882566.70031: done sending task result for task 12673a56-9f93-3fa5-01be-000000000bb7 27885 1726882566.70034: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.106145", "end": "2024-09-20 21:36:06.652536", "rc": 0, "start": "2024-09-20 21:36:06.546391" } STDOUT: CHECK DNS AND CONNECTIVITY 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 15023 0 --:--:-- --:--:-- --:--:-- 15250 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 4612 0 --:--:-- --:--:-- --:--:-- 4619 27885 1726882566.70103: no more pending results, returning what we have 27885 1726882566.70107: results queue empty 27885 1726882566.70108: checking for 
any_errors_fatal 27885 1726882566.70118: done checking for any_errors_fatal 27885 1726882566.70118: checking for max_fail_percentage 27885 1726882566.70120: done checking for max_fail_percentage 27885 1726882566.70121: checking to see if all hosts have failed and the running result is not ok 27885 1726882566.70122: done checking to see if all hosts have failed 27885 1726882566.70122: getting the remaining hosts for this loop 27885 1726882566.70124: done getting the remaining hosts for this loop 27885 1726882566.70131: getting the next task for host managed_node2 27885 1726882566.70140: done getting next task for host managed_node2 27885 1726882566.70142: ^ task is: TASK: meta (flush_handlers) 27885 1726882566.70146: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882566.70151: getting variables 27885 1726882566.70153: in VariableManager get_vars() 27885 1726882566.70192: Calling all_inventory to load vars for managed_node2 27885 1726882566.70201: Calling groups_inventory to load vars for managed_node2 27885 1726882566.70204: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882566.70214: Calling all_plugins_play to load vars for managed_node2 27885 1726882566.70217: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882566.70219: Calling groups_plugins_play to load vars for managed_node2 27885 1726882566.71202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882566.72058: done with get_vars() 27885 1726882566.72074: done getting variables 27885 1726882566.72126: in VariableManager get_vars() 27885 1726882566.72137: Calling all_inventory to load vars for managed_node2 27885 1726882566.72138: Calling groups_inventory to load vars for managed_node2 27885 1726882566.72140: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882566.72144: Calling all_plugins_play to load vars for managed_node2 27885 1726882566.72147: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882566.72149: Calling groups_plugins_play to load vars for managed_node2 27885 1726882566.72822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882566.73741: done with get_vars() 27885 1726882566.73759: done queuing things up, now waiting for results queue to drain 27885 1726882566.73760: results queue empty 27885 1726882566.73761: checking for any_errors_fatal 27885 1726882566.73763: done checking for any_errors_fatal 27885 1726882566.73763: checking for max_fail_percentage 27885 1726882566.73764: done checking for max_fail_percentage 27885 1726882566.73765: checking to see if all hosts have failed and the running result is not ok 27885 1726882566.73765: done checking to see if all hosts have failed 27885 1726882566.73766: getting the remaining hosts for this loop 27885 1726882566.73766: done getting the remaining hosts for this loop 27885 1726882566.73768: getting the next task for host managed_node2 27885 1726882566.73771: done getting next task for host managed_node2 27885 1726882566.73772: ^ task is: TASK: meta (flush_handlers) 27885 1726882566.73773: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27885 1726882566.73775: getting variables 27885 1726882566.73775: in VariableManager get_vars() 27885 1726882566.73783: Calling all_inventory to load vars for managed_node2 27885 1726882566.73785: Calling groups_inventory to load vars for managed_node2 27885 1726882566.73786: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882566.73790: Calling all_plugins_play to load vars for managed_node2 27885 1726882566.73792: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882566.73795: Calling groups_plugins_play to load vars for managed_node2 27885 1726882566.74594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882566.76170: done with get_vars() 27885 1726882566.76187: done getting variables 27885 1726882566.76224: in VariableManager get_vars() 27885 1726882566.76234: Calling all_inventory to load vars for managed_node2 27885 1726882566.76235: Calling groups_inventory to load vars for managed_node2 27885 1726882566.76237: Calling all_plugins_inventory to load vars for managed_node2 27885 1726882566.76240: Calling all_plugins_play to load vars for managed_node2 27885 1726882566.76241: Calling groups_plugins_inventory to load vars for managed_node2 27885 1726882566.76243: Calling groups_plugins_play to load vars for managed_node2 27885 1726882566.76875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27885 1726882566.77919: done with get_vars() 27885 1726882566.77942: done queuing things up, now waiting for results queue to drain 27885 1726882566.77944: results queue empty 27885 1726882566.77945: checking for any_errors_fatal 27885 1726882566.77946: done checking for any_errors_fatal 27885 1726882566.77947: checking for max_fail_percentage 27885 1726882566.77948: done checking for max_fail_percentage 27885 1726882566.77948: checking to see if all hosts have failed and the running result is not ok 27885 1726882566.77949: done checking to see if all hosts have failed 27885 1726882566.77950: getting the remaining hosts for this loop 27885 1726882566.77951: done getting the remaining hosts for this loop 27885 1726882566.77953: getting the next task for host managed_node2 27885 1726882566.77956: done getting next task for host managed_node2 27885 1726882566.77957: ^ task is: None 27885 1726882566.77958: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27885 1726882566.77959: done queuing things up, now waiting for results queue to drain 27885 1726882566.77960: results queue empty 27885 1726882566.77961: checking for any_errors_fatal 27885 1726882566.77961: done checking for any_errors_fatal 27885 1726882566.77962: checking for max_fail_percentage 27885 1726882566.77963: done checking for max_fail_percentage 27885 1726882566.77964: checking to see if all hosts have failed and the running result is not ok 27885 1726882566.77964: done checking to see if all hosts have failed 27885 1726882566.77966: getting the next task for host managed_node2 27885 1726882566.77969: done getting next task for host managed_node2 27885 1726882566.77969: ^ task is: None 27885 1726882566.77970: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node2 : ok=108 changed=3 unreachable=0 failed=0 skipped=87 rescued=0 ignored=2

Friday 20 September 2024 21:36:06 -0400 (0:00:00.511) 0:00:39.422 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 1.91s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.82s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.72s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.62s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 1.33s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.31s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:6
Create veth interface ethtest0 ------------------------------------------ 1.09s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Create veth interface ethtest1 ------------------------------------------ 1.07s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.04s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 0.92s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.90s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Install iproute --------------------------------------------------------- 0.82s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.81s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.73s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 0.71s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Install iproute --------------------------------------------------------- 0.65s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.62s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.57s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.55s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Gather the minimum subset of ansible_facts required by the network role test --- 0.55s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
27885 1726882566.78099: RUNNING CLEANUP
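For reference, the shell payload run by the "Verify DNS and network connectivity" task above is reproduced here with the \n escapes from the logged _raw_params expanded into real lines. The script itself is verbatim from the log; only the comments are added here as annotation and were not part of the original task.

# DNS/connectivity check executed via ansible.legacy.command with _uses_shell=true
set -euo pipefail
echo CHECK DNS AND CONNECTIVITY
for host in mirrors.fedoraproject.org mirrors.centos.org; do
    # Name-resolution check through the system resolver
    if ! getent hosts "$host"; then
        echo FAILED to lookup host "$host"
        exit 1
    fi
    # Reachability check over HTTPS; the response body is discarded and
    # only curl's exit status decides pass/fail
    if ! curl -o /dev/null https://"$host"; then
        echo FAILED to contact host "$host"
        exit 1
    fi
done

Both mirrors resolved and responded, so the script exited with rc=0 in roughly 0.1 s (delta 0:00:00.106145), which matches the ok status for the task and the failed=0 count in the play recap.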